This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

[MXNET-1083] Add the example to demonstrate the inference workflow using C++ API #13294

Merged
merged 6 commits on Dec 15, 2018
7 changes: 4 additions & 3 deletions cpp-package/example/README.md
@@ -2,7 +2,8 @@

## Building C++ examples

The examples are built while building the MXNet library and cpp-package from source. However, they can be built manually as follows:
The examples in this folder demonstrate the **training** workflow. Examples related to the **inference workflow** can be found in the [inference](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/inference>) folder.
The examples in this folder are built while building the MXNet library and cpp-package from source. However, they can be built manually as follows:

From the cpp-package/examples directory:

@@ -18,7 +19,7 @@ The examples that are built to be run on GPU may not work on a non-GPU machine
The makefile will also download the necessary data files and store them in a data folder. (The download will take a couple of minutes, but will be done only once on a fresh installation.)


## Examples
## Examples demonstrating training workflow

This directory contains the following examples. In order to run the examples, ensure that the path to the MXNet shared library is added to the OS-specific environment variable: **LD\_LIBRARY\_PATH** on Linux and macOS, and **PATH** on Windows.
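
Most of these training examples follow the same basic pattern: define the network as a `Symbol`, bind an `Executor`, and then loop over forward, backward, and optimizer updates. The snippet below is a minimal, hypothetical sketch of that pattern rather than code taken from any single example; the tiny two-layer network, the constant placeholder batch, and the hyper-parameters are illustrative assumptions only.

```
// Minimal sketch of the training pattern shared by the examples in this folder.
// The network, the placeholder batch, and the hyper-parameters are illustrative.
#include <map>
#include <string>
#include <vector>
#include "mxnet-cpp/MxNetCpp.h"

using namespace mxnet::cpp;

int main() {
  const int batch_size = 32;
  Context ctx = Context::cpu();

  // 1. Define a small network symbolically.
  Symbol data = Symbol::Variable("data");
  Symbol label = Symbol::Variable("label");
  Symbol fc1 = FullyConnected("fc1", data, Symbol::Variable("fc1_weight"),
                              Symbol::Variable("fc1_bias"), 128);
  Symbol act1 = Activation("act1", fc1, ActivationActType::kRelu);
  Symbol fc2 = FullyConnected("fc2", act1, Symbol::Variable("fc2_weight"),
                              Symbol::Variable("fc2_bias"), 10);
  Symbol net = SoftmaxOutput("softmax", fc2, label);

  // 2. Allocate the input/label arrays, infer the remaining shapes, initialize.
  std::map<std::string, NDArray> args;
  args["data"] = NDArray(Shape(batch_size, 784), ctx);
  args["label"] = NDArray(Shape(batch_size), ctx);
  net.InferArgsMap(ctx, &args, args);
  Xavier xavier(Xavier::gaussian, Xavier::in, 2.34);
  for (auto &arg : args) xavier(arg.first, &arg.second);

  // Placeholder batch; the real examples copy batches from a data iterator here.
  std::vector<float> batch(batch_size * 784, 0.5f), labels(batch_size, 1.0f);
  args["data"].SyncCopyFromCPU(batch.data(), batch.size());
  args["label"].SyncCopyFromCPU(labels.data(), labels.size());

  // 3. Bind an executor and create an optimizer.
  Executor *exec = net.SimpleBind(ctx, args);
  Optimizer *opt = OptimizerRegistry::Find("sgd");
  opt->SetParam("lr", 0.01);

  // 4. Training loop: forward pass, backward pass, parameter update.
  std::vector<std::string> arg_names = net.ListArguments();
  for (int iter = 0; iter < 100; ++iter) {
    exec->Forward(true);
    exec->Backward();
    for (size_t i = 0; i < arg_names.size(); ++i) {
      if (arg_names[i] == "data" || arg_names[i] == "label") continue;
      opt->Update(i, exec->arg_arrays[i], exec->grad_arrays[i]);
    }
  }

  NDArray::WaitAll();
  delete exec;
  delete opt;
  MXNotifyShutdown();
  return 0;
}
```

Several of the actual examples replace the placeholder batch with MNIST batches from an `MXDataIter` and track progress with the `Accuracy` metric.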

@@ -97,7 +98,7 @@ build/lenet_with_mxdataiter 10

In addition, there is `run_lenet_with_mxdataiter.sh`, which downloads the MNIST data and runs the `lenet_with_mxdataiter` example.

###[inception_bn.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/inception_bn.cpp>)
### [inception_bn.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/inception_bn.cpp>)

The code implements an Inception network using the C++ API with batch normalization. The example uses MNIST data to train the network. The model trains for 100 epochs. The example can be run by executing the following command:

40 changes: 40 additions & 0 deletions cpp-package/example/inference/Makefile
@@ -0,0 +1,40 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.


# Build every .cpp file in this directory into an executable of the same name.
CPPEX_SRC = $(wildcard *.cpp)
CPPEX_EXE = $(patsubst %.cpp, %, $(CPPEX_SRC))
# OpenCV flags used for image loading and preprocessing in the examples.
OPENCV_CFLAGS=`pkg-config --cflags opencv`
OPENCV_LDFLAGS=`pkg-config --libs opencv`

CXX=g++


CFLAGS=$(COMMFLAGS) -I../../../3rdparty/tvm/nnvm/include -I../../../3rdparty/dmlc-core/include -I ../../include -I ../../../include -Wall -O3 -msse3 -funroll-loops -Wno-unused-parameter -Wno-unknown-pragmas
CPPEX_EXTRA_LDFLAGS := -L../../../lib -lmxnet $(OPENCV_LDFLAGS)

all: $(CPPEX_EXE)

debug: CPPEX_CFLAGS += -DDEBUG -g
debug: all


# Pattern rule: compile each example and link it against libmxnet and OpenCV.
$(CPPEX_EXE):% : %.cpp
	$(CXX) -std=c++0x $(CFLAGS) $(CPPEX_CFLAGS) -o $@ $(filter %.cpp %.a, $^) $(CPPEX_EXTRA_LDFLAGS)

clean:
	rm -f $(CPPEX_EXE)
41 changes: 41 additions & 0 deletions cpp-package/example/inference/README.md
@@ -0,0 +1,41 @@
# MXNet C++ Package Inference Workflow Examples

## Building C++ Inference examples

The examples in this folder demonstrate the **inference** workflow.
To build the examples, use the following commands:

- Release: **make all**
- Debug: **make debug all**


## Examples demonstrating inference workflow

This directory contains the following examples. In order to run the examples, ensure that the path to the MXNet shared library is added to the OS-specific environment variable: **LD\_LIBRARY\_PATH** on Linux and macOS, and **PATH** on Windows.

### [inception_inference.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/inference/inception_inference.cpp>)

This example demonstrates the image classification workflow with a pre-trained model using the MXNet C++ API. The command line parameters the example accepts are shown below:

```
./inception_inference --help
Usage:
inception_inference --symbol <model symbol file in json format>
                    --params <model params file>
                    --image <path to the image used for prediction>
                    --synset <file containing labels for prediction>
                    [--input_shape <dimensions of input image, e.g. "3 224 224">]
                    [--mean <file containing mean image for normalizing the input image>]
                    [--gpu] Specify this option if workflow needs to be run in gpu context
```
The model symbol (json), params, and synset files are required to run this example. A sample command line is as follows:

```
./inception_inference --symbol "./model/Inception-BN-symbol.json" --params "./model/Inception-BN-0126.params" --synset "./model/synset.txt" --mean "./model/mean_224.nd" --image "./model/dog.jpg"
```
Alternatively, the script [unit_test_inception_inference.sh](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/inference/unit_test_inception_inference.sh>) downloads the pre-trained **Inception** model and a test image. Users can invoke this script as follows:

```
./unit_test_inception_inference.sh
```
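
Internally, inception_inference.cpp follows a load-bind-forward pattern: load the symbol and parameter files, bind an executor, run a forward pass on the preprocessed image, and map the highest score to a label from the synset file. The sketch below is a simplified, hypothetical outline of those steps and not the example's exact code; the hard-coded paths and the zero-filled pixel buffer stand in for the real command-line handling, image decoding, and mean-image subtraction.

```
// Simplified outline of the inference steps performed by inception_inference.cpp.
// The paths, the 1x3x224x224 input shape, and the zero-filled pixel buffer are
// placeholders for the example's actual argument parsing and image preprocessing.
#include <algorithm>
#include <fstream>
#include <iostream>
#include <map>
#include <string>
#include <vector>
#include "mxnet-cpp/MxNetCpp.h"

using namespace mxnet::cpp;

int main() {
  Context ctx = Context::cpu();  // use Context::gpu(0) when running with --gpu

  // 1. Load the network definition and the trained parameters.
  Symbol net = Symbol::Load("./model/Inception-BN-symbol.json");
  std::map<std::string, NDArray> params;
  NDArray::Load("./model/Inception-BN-0126.params", nullptr, &params);

  // Split the saved parameters into argument and auxiliary maps.
  std::map<std::string, NDArray> args, auxs;
  for (const auto &kv : params) {
    if (kv.first.substr(0, 4) == "arg:")
      args[kv.first.substr(4)] = kv.second.Copy(ctx);
    else if (kv.first.substr(0, 4) == "aux:")
      auxs[kv.first.substr(4)] = kv.second.Copy(ctx);
  }

  // 2. Provide the input placeholder and copy the (preprocessed) image into it.
  args["data"] = NDArray(Shape(1, 3, 224, 224), ctx);
  std::vector<float> image(1 * 3 * 224 * 224, 0.0f);  // placeholder pixels
  args["data"].SyncCopyFromCPU(image.data(), image.size());

  // 3. Bind an executor for the graph and run a single forward pass.
  Executor *exec = net.SimpleBind(ctx, args, std::map<std::string, NDArray>(),
                                  std::map<std::string, OpReqType>(), auxs);
  exec->Forward(false);

  // 4. Pick the class with the highest score and print its synset label.
  NDArray out = exec->outputs[0].Copy(Context::cpu());
  out.WaitToRead();
  const float *scores = out.GetData();
  size_t num_classes = out.GetShape()[1];
  size_t best = std::max_element(scores, scores + num_classes) - scores;

  std::vector<std::string> synset;
  std::ifstream fin("./model/synset.txt");
  for (std::string line; std::getline(fin, line);) synset.push_back(line);
  std::cout << "Predicted: " << synset[best] << std::endl;

  delete exec;
  MXNotifyShutdown();
  return 0;
}
```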