
[MXNET-1121] Example to demonstrate the inference workflow using RNN #13680

Merged
merged 35 commits into from
Feb 7, 2019
Changes from 30 commits
Commits
35 commits
9519a2d
[MXNET-1121] Example to demonstrate the inference workflow using RNN
leleamol Dec 18, 2018
8c06090
Merge branch 'master' of https://github.com/apache/incubator-mxnet in…
leleamol Jan 9, 2019
7a341f9
Addressed the review comments. Updated the ReadMe files.
leleamol Jan 9, 2019
c19760a
Removed the unnecessary creation of NDArray
leleamol Jan 9, 2019
438c3c4
Added the unit tests to nightly tests to catch the failure.
leleamol Jan 10, 2019
c2f7a67
Updated the makefiles and unit tests so that the examples are built a…
leleamol Jan 11, 2019
768fe5e
Added the visual representation of the model and fixed the CI failure.
leleamol Jan 11, 2019
d8abf83
Added the missing pdf file.
leleamol Jan 12, 2019
fd33d22
Fixing the broken ci_test.sh
leleamol Jan 12, 2019
e31a5bd
Update cpp-package/example/inference/README.md
IvyBazan Jan 12, 2019
c53d329
Update cpp-package/example/inference/README.md
IvyBazan Jan 12, 2019
9d315a2
Update cpp-package/example/inference/README.md
IvyBazan Jan 12, 2019
c774f3c
Update cpp-package/example/inference/README.md
IvyBazan Jan 12, 2019
8d88feb
Update cpp-package/example/inference/README.md
IvyBazan Jan 12, 2019
6d631b3
Update cpp-package/example/inference/simple_rnn.cpp
IvyBazan Jan 12, 2019
0d00c74
Update cpp-package/example/inference/simple_rnn.cpp
IvyBazan Jan 12, 2019
03744a3
Update cpp-package/example/inference/simple_rnn.cpp
IvyBazan Jan 12, 2019
ff5fca3
Update cpp-package/example/inference/simple_rnn.cpp
IvyBazan Jan 12, 2019
4ffd4a9
Update cpp-package/example/inference/simple_rnn.cpp
IvyBazan Jan 12, 2019
bd6fad5
Update cpp-package/example/inference/README.md
IvyBazan Jan 12, 2019
d5119c2
Update cpp-package/example/inference/README.md
IvyBazan Jan 12, 2019
4bafe95
Update cpp-package/example/inference/README.md
IvyBazan Jan 12, 2019
1da9482
Update cpp-package/example/inference/simple_rnn.cpp
IvyBazan Jan 12, 2019
4393f18
Update cpp-package/example/inference/simple_rnn.cpp
IvyBazan Jan 12, 2019
45cbba9
Applying unresolved changes to README file.
leleamol Jan 12, 2019
bf48c42
Fixing the CI build failure.
leleamol Jan 14, 2019
bd35b40
Updated the RNN example from sequence generation to sentiment analysis
leleamol Jan 19, 2019
b198339
Updated the readme files. Updated the example to use trained model an…
leleamol Jan 21, 2019
4897901
Addressed the review comment to increase the default sequence length.…
leleamol Jan 23, 2019
bb14d79
Updated the example to handle variable length input. Updated the read…
leleamol Jan 28, 2019
a487ca9
Updated the example to share the memory between executors by createin…
leleamol Jan 30, 2019
c3cace1
Updated the creation of executors from largest to smallest bucket key
leleamol Feb 5, 2019
7d4a173
Creating the executor for the highest bucket key.
leleamol Feb 5, 2019
b1e074f
Updated the unit test to check for the results in a range and modifie…
leleamol Feb 6, 2019
0f155e9
Fixed the logic to find the right bucket.
leleamol Feb 6, 2019
9 changes: 4 additions & 5 deletions cpp-package/README.md
@@ -8,7 +8,7 @@ The users of these bindings are required to build this package as mentioned belo
The cpp-package directory contains the implementation of C++ API. As mentioned above, users are required to build this directory or package before using it.
**The cpp-package is built while building the MXNet shared library, *libmxnet.so*.**

###Steps to build the C++ package:
### Steps to build the C++ package:
1. Building the MXNet C++ package requires building MXNet from source.
2. Clone the MXNet GitHub repository **recursively** to ensure the code in submodules is available for building MXNet.
```
@@ -17,10 +17,10 @@ The cpp-package directory contains the implementation of C++ API. As mentioned a

3. Install the [prerequisites](<https://mxnet.incubator.apache.org/install/build_from_source#prerequisites>), desired [BLAS libraries](<https://mxnet.incubator.apache.org/install/build_from_source#blas-library>) and optional [OpenCV, CUDA, and cuDNN](<https://mxnet.incubator.apache.org/install/build_from_source#optional>) for building MXNet from source.
4. There is a configuration file for make, [make/config.mk](<https://github.com/apache/incubator-mxnet/blob/master/make/config.mk>) that contains all the compilation options. You can edit this file and set the appropriate options prior to running the **make** command.
5. Please refer to [platform specific build instructions](<https://mxnet.incubator.apache.org/install/build_from_source#build-instructions-by-operating-system>) and available [build configurations](https://mxnet.incubator.apache.org/install/build_from_source#build-configurations) for more details.
5. Please refer to [platform specific build instructions](<https://mxnet.incubator.apache.org/install/build_from_source#build-instructions-by-operating-system>) and available [build configurations](https://mxnet.incubator.apache.org/install/build_from_source#build-configurations) for more details.
5. For enabling the build of C++ Package, set the **USE\_CPP\_PACKAGE = 1** in [make/config.mk](<https://github.com/apache/incubator-mxnet/blob/master/make/config.mk>). Optionally, the compilation flag can also be specified on **make** command line as follows.
```
make -j USE_CPP_PACKAGE=1
make -j USE_CPP_PACKAGE=1
```

## Usage
@@ -42,5 +42,4 @@ A basic tutorial can be found at <https://mxnet.incubator.apache.org/tutorials/c

## Examples

The example directory contains examples for you to get started.

The example directory contains examples for you to get started. Please build the MXNet C++ Package before building the examples.
2 changes: 1 addition & 1 deletion cpp-package/cpp-package.mk
@@ -42,4 +42,4 @@ cpp-package-lint:
(cd cpp-package; python scripts/lint.py dmlc ${LINT_LANG} include example)

include cpp-package/example/example.mk

include cpp-package/example/inference/inference.mk
3 changes: 2 additions & 1 deletion cpp-package/example/README.md
@@ -3,7 +3,8 @@
## Building C++ examples

The examples in this folder demonstrate the **training** workflow. The **inference workflow** related examples can be found in [inference](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/inference>) folder.
The examples in this folder are built while building the MXNet library and cpp-package from source . However, they can be built manually as follows
Please build the MXNet C++ Package as explained in the [README](<https://github.com/apache/incubator-mxnet/tree/master/cpp-package#building-c-package>) file before building these examples manually.
The examples in this folder are built while building the MXNet library and cpp-package from source. However, they can be built manually as follows

From cpp-package/examples directory

68 changes: 67 additions & 1 deletion cpp-package/example/inference/README.md
@@ -2,7 +2,7 @@

## Building C++ Inference examples

The examples in this folder demonstrate the **inference** workflow.
The examples in this folder demonstrate the **inference** workflow. Please build the MXNet C++ Package as explained in the [README](<https://github.com/apache/incubator-mxnet/tree/master/cpp-package#building-c-package>) file before building these examples.
To build the examples, use the following commands:

- Release: **make all**
@@ -39,3 +39,69 @@ Alternatively, The script [unit_test_inception_inference.sh](<https://github.com
```
./unit_test_inception_inference.sh
```

### [sentiment_analysis_rnn.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/inference/sentiment_analysis_rnn.cpp>)
This example demonstrates how you can load a pre-trained RNN model and use it to predict the sentiment expressed in a given movie review with the MXNet C++ API. The example is capable of processing variable length inputs. It performs the following tasks:
- Loads the pre-trained RNN model.
- Loads the dictionary file containing the word to index mapping.
- Splits the review into multiple lines separated by ".".
- Predicts the sentiment score for each line and outputs the average score, as sketched below.
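
The splitting and averaging steps can be illustrated with a short sketch in standard C++; the helper names below are illustrative and not necessarily those used in sentiment_analysis_rnn.cpp:

```
#include <sstream>
#include <string>
#include <vector>

// Split a multi-line review on '.' into individual lines.
std::vector<std::string> SplitReview(const std::string &review) {
  std::vector<std::string> lines;
  std::stringstream ss(review);
  std::string line;
  while (std::getline(ss, line, '.')) {
    if (!line.empty()) {
      lines.push_back(line);
    }
  }
  return lines;
}

// The final sentiment is the average of the per-line scores.
float AverageScore(const std::vector<float> &scores) {
  if (scores.empty()) return 0.0f;
  float sum = 0.0f;
  for (float score : scores) {
    sum += score;
  }
  return sum / static_cast<float>(scores.size());
}
```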

The example handles variable length input by implementing the following technique:
- The example creates executors for pre-determined input lengths such as 5, 10, 15, 20, 25, etc., called **buckets**.
- Each bucket is identified by a **bucket-key** representing the length of input required by the corresponding executor.
- For each line in the review, the example counts the number of words in the line and finds the closest bucket (i.e. executor).
- If the bucket key does not match the number of words in the line, the example pads or trims the input line to match the required length, as sketched below.
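
A minimal sketch of the bucket selection and padding logic, assuming illustrative bucket keys and a padding index of 0 (the actual keys and padding token used by the example may differ):

```
#include <vector>

// Pre-determined input lengths (bucket keys) for which executors are created.
// These values are illustrative; the example may use a different set.
const std::vector<int> bucket_keys = {5, 10, 15, 20, 25, 30};

// Return the smallest bucket key that can hold the line; fall back to the
// largest bucket if the line is longer than every key.
int FindClosestBucket(int num_words) {
  for (int key : bucket_keys) {
    if (num_words <= key) {
      return key;
    }
  }
  return bucket_keys.back();
}

// Pad (with an assumed padding index of 0) or trim the token sequence so its
// length matches the chosen bucket key.
std::vector<int> FitToBucket(std::vector<int> tokens, int bucket_key) {
  tokens.resize(bucket_key, 0);  // trims if longer, pads with 0 if shorter
  return tokens;
}
```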

The example uses a pre-trained RNN model trained on the IMDB dataset. The RNN model was built by exercising the [GluonNLP Sentiment Analysis Tutorial](<http://gluon-nlp.mxnet.io/examples/sentiment_analysis/sentiment_analysis.html#>). The tutorial uses the 'standard_lstm_lm_200' model available in the Gluon Model Zoo and fine-tunes it for the IMDB dataset.
The model consists of:
- Embedding Layer
- 2 LSTM Layers with hidden dimension size of 200
- Average pooling layer
- Sigmoid output layer

The model was trained for 10 epochs to achieve 85% test accuracy.
A visual representation of the model is available [here](<http://gluon-nlp.mxnet.io/examples/sentiment_analysis/sentiment_analysis.html#Sentiment-analysis-model-with-pre-trained-language-model-encoder>).

The model files can be downloaded from the following links:
- [sentiment_analysis-symbol.json](<https://s3.amazonaws.com/mxnet-cpp/RNN_model/sentiment_analysis-symbol.json>)
- [sentiment_analysis-0010.params](<https://s3.amazonaws.com/mxnet-cpp/RNN_model/sentiment_analysis-0010.params>)
- [sentiment_token_to_idx.txt](<https://s3.amazonaws.com/mxnet-cpp/RNN_model/sentiment_token_to_idx.txt>): Each line of the dictionary file contains a word and a unique index for that word, separated by a space, with a total of 32787 words generated from the training dataset.

The example downloads the above files while running.
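
The following sketch shows one way to load the downloaded files with the MXNet C++ API; the function name and local file paths are illustrative, and the example itself may structure this differently:

```
#include <fstream>
#include <map>
#include <string>
#include "mxnet-cpp/MxNetCpp.h"

using namespace mxnet::cpp;

void LoadModelAndDictionary() {
  // Load the network definition exported by the GluonNLP tutorial.
  Symbol net = Symbol::Load("./sentiment_analysis-symbol.json");

  // Load the trained parameters into a name -> NDArray map.
  std::map<std::string, NDArray> params;
  NDArray::Load("./sentiment_analysis-0010.params", nullptr, &params);

  // Build the word-to-index dictionary; each line is "<word> <index>".
  std::map<std::string, int> word_to_index;
  std::ifstream dict_file("./sentiment_token_to_idx.txt");
  std::string word;
  int index;
  while (dict_file >> word >> index) {
    word_to_index[word] = index;
  }
}
```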

The example's command line parameters are as shown below:

```
./sentiment_analysis_rnn --help
Usage:
sentiment_analysis_rnn
--input Input movie review. The review can be single line or multiline.e.g. "This movie is the best." OR "This movie is the best. The direction is awesome."
[--gpu] Specify this option if workflow needs to be run in gpu context
If the review is multiline, the example predicts sentiment score for each line and the final score is the average of scores obtained for each line.

```
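
The --gpu option switches the context in which the model is bound and executed. A minimal illustrative helper (not the exact code from the example) could look like this:

```
#include "mxnet-cpp/MxNetCpp.h"

using namespace mxnet::cpp;

// Run on the first GPU when --gpu is given, otherwise on the CPU.
Context GetContext(bool use_gpu) {
  return use_gpu ? Context::gpu(0) : Context::cpu();
}
```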

The following command shows how to run the example with a movie review containing only one line.

```
./sentiment_analysis_rnn --input "This movie has the great story"
```

The above command will output the sentiment score as follows:
```
sentiment_analysis_rnn.cpp:346: Input Line : [This movie has the great story] Score : 0.999898
sentiment_analysis_rnn.cpp:449: The sentiment score between 0 and 1, (1 being positive)=0.999898
```

The following command shows how to invoke the example with a multi-line review.

```
./sentiment_analysis_rnn --input "This movie is the best. The direction is awesome."
```
The above command will output the sentiment score for each line in the review as well as the average score, as follows:
```
Input Line : [This movie is the best] Score : 0.964498
Input Line : [ The direction is awesome] Score : 0.968855
The sentiment score between 0 and 1, (1 being positive)=0.966677
```

Alternatively, you can run the [unit_test_sentiment_analysis_rnn.sh](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/inference/unit_test_sentiment_analysis_rnn.sh>) script.
39 changes: 39 additions & 0 deletions cpp-package/example/inference/inference.mk
@@ -0,0 +1,39 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

CPPEX_SRC = $(wildcard cpp-package/example/inference/*.cpp)
CPPEX_EXE = $(patsubst cpp-package/example/inference/%.cpp, build/cpp-package/example/%, $(CPPEX_SRC))

CPPEX_CFLAGS += -Icpp-package/include
CPPEX_EXTRA_LDFLAGS := -L$(ROOTDIR)/lib -lmxnet

EXTRA_PACKAGES += cpp-package-inference-example-all
EXTRA_PACKAGES_CLEAN += cpp-package-inference-example-clean

.PHONY: cpp-package-inference-example-all cpp-package-inference-example-clean

cpp-package-inference-example-all: cpp-package-all $(CPPEX_EXE)

build/cpp-package/example/% : cpp-package/example/inference/%.cpp lib/libmxnet.so $(CPP_PACKAGE_OP_H_FILE)
@mkdir -p $(@D)
$(CXX) -std=c++11 $(CFLAGS) $(CPPEX_CFLAGS) -MM -MT cpp-package/example/inference/$* $< >build/cpp-package/example/$*.d
$(CXX) -std=c++11 $(CFLAGS) $(CPPEX_CFLAGS) -o $@ $(filter %.cpp %.a, $^) $(LDFLAGS) $(CPPEX_EXTRA_LDFLAGS)

cpp-package-inference-example-clean:
rm -rf build/cpp-package/example/inference*

-include build/cpp-package/example/inference/*.d