This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

[MXNET-1121] Example to demonstrate the inference workflow using RNN #13680

Merged Feb 7, 2019 (35 commits)

Commits
9519a2d
[MXNET-1121] Example to demonstrate the inference workflow using RNN
leleamol Dec 18, 2018
8c06090
Merge branch 'master' of https://github.com/apache/incubator-mxnet in…
leleamol Jan 9, 2019
7a341f9
Addressed the review comments. Updated the ReadMe files.
leleamol Jan 9, 2019
c19760a
Removed the unnecessary creation of NDArray
leleamol Jan 9, 2019
438c3c4
Added the unit tests to nightly tests to catch the failure.
leleamol Jan 10, 2019
c2f7a67
Updated the makefiles and unit tests so that the examples are built a…
leleamol Jan 11, 2019
768fe5e
Added the visual representation of the model and fixed the CI failure.
leleamol Jan 11, 2019
d8abf83
Added the missing pdf file.
leleamol Jan 12, 2019
fd33d22
Fixing the broken ci_test.sh
leleamol Jan 12, 2019
e31a5bd
Update cpp-package/example/inference/README.md
IvyBazan Jan 12, 2019
c53d329
Update cpp-package/example/inference/README.md
IvyBazan Jan 12, 2019
9d315a2
Update cpp-package/example/inference/README.md
IvyBazan Jan 12, 2019
c774f3c
Update cpp-package/example/inference/README.md
IvyBazan Jan 12, 2019
8d88feb
Update cpp-package/example/inference/README.md
IvyBazan Jan 12, 2019
6d631b3
Update cpp-package/example/inference/simple_rnn.cpp
IvyBazan Jan 12, 2019
0d00c74
Update cpp-package/example/inference/simple_rnn.cpp
IvyBazan Jan 12, 2019
03744a3
Update cpp-package/example/inference/simple_rnn.cpp
IvyBazan Jan 12, 2019
ff5fca3
Update cpp-package/example/inference/simple_rnn.cpp
IvyBazan Jan 12, 2019
4ffd4a9
Update cpp-package/example/inference/simple_rnn.cpp
IvyBazan Jan 12, 2019
bd6fad5
Update cpp-package/example/inference/README.md
IvyBazan Jan 12, 2019
d5119c2
Update cpp-package/example/inference/README.md
IvyBazan Jan 12, 2019
4bafe95
Update cpp-package/example/inference/README.md
IvyBazan Jan 12, 2019
1da9482
Update cpp-package/example/inference/simple_rnn.cpp
IvyBazan Jan 12, 2019
4393f18
Update cpp-package/example/inference/simple_rnn.cpp
IvyBazan Jan 12, 2019
45cbba9
Applying unresolved changes to README file.
leleamol Jan 12, 2019
bf48c42
Fixing the CI build failure.
leleamol Jan 14, 2019
bd35b40
Updated the RNN example from sequence generation to sentiment analysis
leleamol Jan 19, 2019
b198339
Updated the readme files. Updated the example to use trained model an…
leleamol Jan 21, 2019
4897901
Addressed the review comment to increase the default sequence length.…
leleamol Jan 23, 2019
bb14d79
Updated the example to handle variable length input. Updated the read…
leleamol Jan 28, 2019
a487ca9
Updated the example to share the memory between executors by createin…
leleamol Jan 30, 2019
c3cace1
Updated the creation of executors from largest to smallest bucket key
leleamol Feb 5, 2019
7d4a173
Creating the executor for the highest bucket key.
leleamol Feb 5, 2019
b1e074f
Updated the unit test to check for the results in a range and modifie…
leleamol Feb 6, 2019
0f155e9
Fixed the logic to find the right bucket.
leleamol Feb 6, 2019
9 changes: 4 additions & 5 deletions cpp-package/README.md
@@ -8,7 +8,7 @@ The users of these bindings are required to build this package as mentioned below
The cpp-package directory contains the implementation of C++ API. As mentioned above, users are required to build this directory or package before using it.
**The cpp-package is built while building the MXNet shared library, *libmxnet.so*.**

-###Steps to build the C++ package:
+### Steps to build the C++ package:
1. Building the MXNet C++ package requires building MXNet from source.
2. Clone the MXNet GitHub repository **recursively** to ensure the code in submodules is available for building MXNet.
```
@@ -17,10 +17,10 @@ The cpp-package directory contains the implementation of C++ API. As mentioned a

3. Install the [prerequisites](<https://mxnet.incubator.apache.org/install/build_from_source#prerequisites>), desired [BLAS libraries](<https://mxnet.incubator.apache.org/install/build_from_source#blas-library>) and optional [OpenCV, CUDA, and cuDNN](<https://mxnet.incubator.apache.org/install/build_from_source#optional>) for building MXNet from source.
4. There is a configuration file for make, [make/config.mk](<https://github.com/apache/incubator-mxnet/blob/master/make/config.mk>) that contains all the compilation options. You can edit this file and set the appropriate options prior to running the **make** command.
5. Please refer to [platform specific build instructions](<https://mxnet.incubator.apache.org/install/build_from_source#build-instructions-by-operating-system>) and available [build configurations](https://mxnet.incubator.apache.org/install/build_from_source#build-configurations) for more details.
5. For enabling the build of C++ Package, set the **USE\_CPP\_PACKAGE = 1** in [make/config.mk](<https://github.com/apache/incubator-mxnet/blob/master/make/config.mk>). Optionally, the compilation flag can also be specified on **make** command line as follows.
```
make -j USE_CPP_PACKAGE=1
```

## Usage
@@ -42,5 +42,4 @@ A basic tutorial can be found at <https://mxnet.incubator.apache.org/tutorials/c

## Examples

-The example directory contains examples for you to get started.
-
+The example directory contains examples for you to get started. Please build the MXNet C++ Package before building the examples.
2 changes: 1 addition & 1 deletion cpp-package/cpp-package.mk
@@ -42,4 +42,4 @@ cpp-package-lint:
(cd cpp-package; python scripts/lint.py dmlc ${LINT_LANG} include example)

include cpp-package/example/example.mk
-
+include cpp-package/example/inference/inference.mk
3 changes: 2 additions & 1 deletion cpp-package/example/README.md
@@ -3,7 +3,8 @@
## Building C++ examples

The examples in this folder demonstrate the **training** workflow. The **inference workflow** related examples can be found in [inference](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/inference>) folder.
-The examples in this folder are built while building the MXNet library and cpp-package from source . However, they can be built manually as follows
+Please build the MXNet C++ Package as explained in the [README](<https://github.com/apache/incubator-mxnet/tree/master/cpp-package#building-c-package>) file before building these examples manually.
+The examples in this folder are built while building the MXNet library and cpp-package from source. However, they can be built manually as follows

From cpp-package/examples directory

48 changes: 47 additions & 1 deletion cpp-package/example/inference/README.md
@@ -2,7 +2,7 @@

## Building C++ Inference examples

-The examples in this folder demonstrate the **inference** workflow.
+The examples in this folder demonstrate the **inference** workflow. Please build the MXNet C++ Package as explained in the [README](<https://github.com/apache/incubator-mxnet/tree/master/cpp-package#building-c-package>) file before building these examples.
To build the examples, use the following commands:

- Release: **make all**
@@ -39,3 +39,49 @@ Alternatively, The script [unit_test_inception_inference.sh](<https://github.com
```
./unit_test_inception_inference.sh
```

### [simple_rnn.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/inference/simple_rnn.cpp>)
This example demonstrates the sequence prediction workflow with a pre-trained RNN model using the MXNet C++ API: it shows how such a model can be loaded and used to generate an output sequence.
Contributor: This could probably be made simpler, so it is easier to read and follow. @aaronmarkham - can you please help us here with the documentation? Thanks.
The example performs the following tasks:
- Load the pre-trained RNN model.
- Load the dictionary file that contains the word-to-index mapping.
- Convert the input string to a vector of indices, padded to match the input data length.
- Run the forward pass and predict the output string.

The example uses a pre-trained RNN model that was trained on a dataset of speeches given by Obama.
The model consists of:
- an Embedding layer with an embedding size of 650
- 3 LSTM layers with a hidden dimension size of 650 and a sequence length of 35
- a FullyConnected layer
- a SoftmaxOutput layer

The model was trained for 100 epochs.
The visual representation of the model is [here](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/inference/obama-speaks.pdf>).

Contributor: A simple image will be helpful.

Contributor Author: I can get the PDF version of the model (generated using the visualization API). I need some suggestions from @aaronmarkham on how to embed it in the README.

The model files can be found here:
- [obama-speaks-symbol.json](<https://s3.amazonaws.com/mxnet-cpp/RNN_model/obama-speaks-symbol.json>)
- [obama-speaks-0100.params](<https://s3.amazonaws.com/mxnet-cpp/RNN_model/obama-speaks-0100.params>)
- [obama.dictionary.txt](<https://s3.amazonaws.com/mxnet-cpp/RNN_model/obama.dictionary.txt>): each line of the dictionary file contains a word and a unique index for that word, separated by a space, for a total of 14293 words generated from the training dataset.

The example downloads the above files while running.

Contributor: Why the mxnet-cpp bucket, and not the mxnet pretrained-models bucket?

Contributor Author: The mxnet-cpp bucket is not public by default, but its contents are made publicly readable. This is similar to the mxnet-scala bucket used for the Scala examples.

The example's command line parameters are as shown below:

```
./simple_rnn --help
Usage:
simple_rnn
[--input] Input string sequence.
[--gpu] Specify this option if workflow needs to be run in gpu context.

./simple_rnn

or

./simple_rnn --input "Good morning. I appreciate the opportunity to speak here"
```

The example will output a sequence of 35 words as follows:
```
[waters elected Amendment Amendment Amendment Amendment retirement maximize maximize maximize acr citi sophisticatio sophisticatio sophisticatio sophisticatio sophisticatio sophisticatio sophisticatio sophisticatio sophisticatio sophisticatio sophisticatio sophisticatio sophisticatio sophisticatio sophisticatio sophisticatio sophisticatio sophisticatio sophisticatio sophisticatio sophisticatio sophisticatio sophisticatio ]
```

Contributor: Why is the output bad? Can we give an example of a well-trained model?

Contributor Author: I tried to get better output by changing the model hyperparameters but couldn't. It would also require a good amount of input data processing. All of these efforts would require dedicated time and are out of scope for this example, which aims at loading the model and running the forward pass. Improving the model would be a separate task.

Contributor Author: I am working on implementing the RNN model using the C++ API. I can work on improving the accuracy of that model and use it in this example later.

Alternatively, the user can run the [unit_test_simple_rnn.sh](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/inference/unit_test_simple_rnn.sh>) script.
39 changes: 39 additions & 0 deletions cpp-package/example/inference/inference.mk
@@ -0,0 +1,39 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

CPPEX_SRC = $(wildcard cpp-package/example/inference/*.cpp)
CPPEX_EXE = $(patsubst cpp-package/example/inference/%.cpp, build/cpp-package/example/%, $(CPPEX_SRC))

CPPEX_CFLAGS += -Icpp-package/include
CPPEX_EXTRA_LDFLAGS := -L$(ROOTDIR)/lib -lmxnet

EXTRA_PACKAGES += cpp-package-inference-example-all
EXTRA_PACKAGES_CLEAN += cpp-package-inference-example-clean

.PHONY: cpp-package-inference-example-all cpp-package-inference-example-clean

cpp-package-inference-example-all: cpp-package-all $(CPPEX_EXE)

build/cpp-package/example/% : cpp-package/example/inference/%.cpp lib/libmxnet.so $(CPP_PACKAGE_OP_H_FILE)
@mkdir -p $(@D)
$(CXX) -std=c++11 $(CFLAGS) $(CPPEX_CFLAGS) -MM -MT cpp-package/example/inference/$* $< >build/cpp-package/example/$*.d
$(CXX) -std=c++11 $(CFLAGS) $(CPPEX_CFLAGS) -o $@ $(filter %.cpp %.a, $^) $(LDFLAGS) $(CPPEX_EXTRA_LDFLAGS)

cpp-package-inference-example-clean:
rm -rf build/cpp-package/example/inference*

-include build/cpp-package/example/inference/*.d