Improve clojure tutorial (apache#12974)
* Switch tutorial to dependency/ies that exist on Maven

* Improve Clojure Module tutorial

* Add namespace docstring

* Bring verbiage up to date with https://mxnet.incubator.apache.org/api/clojure/module.html

* Add newlines for readability and to keep line length <80

* Nix duplicated section in Clojure Symbol API docs

"Multiple Outputs" is a (deprecated) repeat of "Group Multiple
Symbols".

* Improve Clojure Symbol tutorial

* Add namespace docstring

* Bring verbiage up to date with https://mxnet.incubator.apache.org/api/clojure/symbol.html

* Add newlines for readability and to keep line length <80

* Fix missing end-code-block in Clojure NDArray API docs

* Improve Clojure NDArray tutorial

* Add namespace docstring

* Bring verbiage up to date with https://mxnet.incubator.apache.org/api/clojure/ndarray.html

* Add newlines for readability and to keep line length <80

* Improve Clojure KVStore tutorial

* Add namespace docstring

* Bring verbiage up to date with https://mxnet.incubator.apache.org/api/clojure/kvstore.html

* Add newlines for readability and to keep line length <80

* [MXNET-1017] Updating the readme file for cpp-package and adding readme file for example directory. (apache#12773)

* Updating the readme file for cpp-package and adding readme file for example directory.

* Updating the readme file for cpp-package and adding readme file for example directory.

* Addressed the review comments.

* Addressed the review comments

* Fail the broken link job when broken links are found (apache#12905)

* Fix typo in formula in docstring for GRU cell and layer and add clarification to description (gluon.rnn) (apache#12896)

* Fix typo in GRU cell and layers (gluon.rnn) docstring

* empty

* fix the paths issue for downloading script (apache#12913)
daveliepmann authored and Jose Luis Contreras committed Nov 13, 2018
1 parent 93844d2 commit 628e527
Showing 7 changed files with 250 additions and 179 deletions.
5 changes: 4 additions & 1 deletion contrib/clojure-package/examples/tutorial/project.clj
@@ -19,4 +19,7 @@
:description "MXNET tutorials"
:plugins [[lein-cljfmt "0.5.7"]]
:dependencies [[org.clojure/clojure "1.9.0"]
-                 [org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.1-SNAPSHOT"]])
+                 ;; Uncomment the one appropriate for your machine & configuration:
+                 #_[org.apache.mxnet.contrib.clojure/clojure-mxnet-linux-cpu "1.3.0"]
+                 #_[org.apache.mxnet.contrib.clojure/clojure-mxnet-linux-gpu "1.3.0"]
+                 #_[org.apache.mxnet.contrib.clojure/clojure-mxnet-osx-cpu "1.3.0"]])
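
;; For reference, a minimal sketch of the resulting dependency vector
;; once one entry is uncommented (here, hypothetically, the Linux
;; CPU-only build):
;;
;;   :dependencies [[org.clojure/clojure "1.9.0"]
;;                  [org.apache.mxnet.contrib.clojure/clojure-mxnet-linux-cpu "1.3.0"]]
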
60 changes: 40 additions & 20 deletions contrib/clojure-package/examples/tutorial/src/tutorial/kvstore.clj
@@ -16,35 +16,44 @@
;;

(ns tutorial.kvstore
"A REPL tutorial of the MXNet Clojure API for KVStore, based on
https://mxnet.incubator.apache.org/api/clojure/kvstore.html"
(:require [org.apache.clojure-mxnet.kvstore :as kvstore]
[org.apache.clojure-mxnet.ndarray :as ndarray]
[org.apache.clojure-mxnet.context :as context]))

-;;Basic Push and Pull
-;;Provides basic operation over multiple devices (GPUs or CPUs) on a single device.
-
-;; Initialization
-;; Let’s consider a simple example. It initializes a (int, NDArray) pair into the store, and then pulls the value out.
-
-(def kv (kvstore/create "local")) ;; create a local kvstore
+;;;; Basic Push and Pull
+
+;; Provides basic operations over multiple devices (GPUs or CPUs) on
+;; a single machine.
+
+;;; Initialization
+;; Let’s consider a simple example. It initializes a (`int`,
+;; `NDArray`) pair into the store, and then pulls the value out.
+
+(def kv (kvstore/create "local")) ; create a local kvstore
(def shape [2 3])
-;;; init the kvstore with a vector of keys (strings) and ndarrays
+;; init the kvstore with a vector of keys (strings) and ndarrays
(kvstore/init kv ["3"] [(ndarray/* (ndarray/ones shape) 2)])
(def a (ndarray/zeros shape))
(kvstore/pull kv ["3"] [a])
(ndarray/->vec a) ;=> [2.0 2.0 2.0 2.0 2.0 2.0]


-;;Push, Aggregation, and Updater
-;;For any key that’s been initialized, you can push a new value with the same shape to the key, as follows:
+;;; Push, Aggregation, and Updater
+;; For any key that’s been initialized, you can push a new value with
+;; the same shape to the key, as follows:
(kvstore/push kv ["3"] [(ndarray/* (ndarray/ones shape) 8)])
(kvstore/pull kv ["3"] [a])
(ndarray/->vec a);=>[8.0 8.0 8.0 8.0 8.0 8.0]

-;;The data that you want to push can be stored on any device. Furthermore, you can push multiple values into the same key, where KVStore first sums all of these values, and then pushes the aggregated value, as follows:
+;; The data that you want to push can be stored on any
+;; device. Furthermore, you can push multiple values into the same
+;; key, where KVStore first sums all of these values, and then pushes
+;; the aggregated value, as follows:

-;; using multiple cpus instead of gpus
+;; (Here we use multiple CPUs.)
(def cpus [(context/cpu 0) (context/cpu 1) (context/cpu 2)])
(def b [(ndarray/ones shape {:ctx (nth cpus 0)})
(ndarray/ones shape {:ctx (nth cpus 1)})
@@ -53,22 +62,33 @@
(kvstore/pull kv "3" a)
(ndarray/->vec a) ;=> [3.0 3.0 3.0 3.0 3.0 3.0]


-;;Pull
-;;You’ve already seen how to pull a single key-value pair. Similar to the way that you use the push command, you can pull the value into several devices with a single call.
+;;; Pull
+;; You’ve already seen how to pull a single key-value pair. Similar to
+;; the way that you use the push command, you can pull the value into
+;; several devices with a single call.
(def b [(ndarray/ones shape {:ctx (context/cpu 0)})
(ndarray/ones shape {:ctx (context/cpu 1)})])
(kvstore/pull kv ["3" "3"] b)
(map ndarray/->vec b) ;=> ([3.0 3.0 3.0 3.0 3.0 3.0] [3.0 3.0 3.0 3.0 3.0 3.0])

-;;List Key-Value Pairs
-;;All of the operations that we’ve discussed so far are performed on a single key. KVStore also provides the interface for generating a list of key-value pairs. For a single device, use the following:
+;;;; List Key-Value Pairs
+
+;; All of the operations that we’ve discussed so far are performed on
+;; a single key. KVStore also provides the interface for generating a
+;; list of key-value pairs. For a single device, use the following:

(def ks ["5" "7" "9"])
-(kvstore/init kv ks [(ndarray/ones shape) (ndarray/ones shape) (ndarray/ones shape)])
-(kvstore/push kv ks [(ndarray/ones shape) (ndarray/ones shape) (ndarray/ones shape)])
-(def b [(ndarray/zeros shape) (ndarray/zeros shape) (ndarray/zeros shape)])
+(kvstore/init kv ks [(ndarray/ones shape)
+                     (ndarray/ones shape)
+                     (ndarray/ones shape)])
+(kvstore/push kv ks [(ndarray/ones shape)
+                     (ndarray/ones shape)
+                     (ndarray/ones shape)])
+(def b [(ndarray/zeros shape)
+        (ndarray/zeros shape)
+        (ndarray/zeros shape)])
(kvstore/pull kv ks b)
-(map ndarray/->vec b);=> ([1.0 1.0 1.0 1.0 1.0 1.0] [1.0 1.0 1.0 1.0 1.0 1.0] [1.0 1.0 1.0 1.0 1.0 1.0])
+(map ndarray/->vec b) ;=> ([1.0 1.0 1.0 1.0 1.0 1.0] [1.0 1.0 1.0 1.0 1.0 1.0] [1.0 1.0 1.0 1.0 1.0 1.0])
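
;; A minimal sketch (not part of the diff above) to make the
;; aggregation behavior concrete: pushing three values to the same
;; key in one call sums them elementwise, so 2 + 4 + 6 = 12. This
;; assumes, as in the underlying tutorial, that repeated keys within
;; a single push are aggregated.
(def vs [(ndarray/* (ndarray/ones shape) 2)
         (ndarray/* (ndarray/ones shape) 4)
         (ndarray/* (ndarray/ones shape) 6)])
(kvstore/push kv ["3" "3" "3"] vs)
(kvstore/pull kv ["3"] [a])
(ndarray/->vec a) ;=> [12.0 12.0 12.0 12.0 12.0 12.0] (expected)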


136 changes: 96 additions & 40 deletions contrib/clojure-package/examples/tutorial/src/tutorial/module.clj
@@ -16,6 +16,8 @@
;;

(ns tutorial.module
"A REPL tutorial of the MXNet Clojure API for Module, based on
https://mxnet.incubator.apache.org/api/clojure/module.html"
(:require [clojure.java.io :as io]
[clojure.java.shell :refer [sh]]
[org.apache.clojure-mxnet.eval-metric :as eval-metric]
@@ -24,12 +26,26 @@
[org.apache.clojure-mxnet.symbol :as sym]
[org.apache.clojure-mxnet.ndarray :as ndarray]))


+;; The Module API provides an intermediate and high-level interface
+;; for performing computation with neural networks in MXNet. Module
+;; wraps a Symbol and one or more Executors. It has both a high-level
+;; and an intermediate-level API.
+
+
+;;;; Prepare the Data
+
+;; In this example, we are going to use the MNIST data set. To start,
+;; we can run some helper scripts to download the data for us.

(def data-dir "data/")

(when-not (.exists (io/file (str data-dir "train-images-idx3-ubyte")))
(sh "../../scripts/get_mnist_data.sh"))

+;;; Load the MNIST datasets
+;; MXNet provides functions in the `io` namespace to load the MNIST
+;; datasets into training and test data iterators that we can use
+;; with our module.
(def train-data (mx-io/mnist-iter {:image (str data-dir "train-images-idx3-ubyte")
:label (str data-dir "train-labels-idx1-ubyte")
:label-name "softmax_label"
@@ -47,11 +63,13 @@
:flat true
:silent false}))

-;; The module API provides an intermediate and high-level interface for performing computation with neural networks in MXNet. Module wraps a Symbol and one or more Executors. It has both a high level and intermediate level api
-
-;; Preparing a module for Computation
-
-;; construct a module
+;;;; Preparing a module for Computation
+
+;; To construct a module, we need to have a symbol as input. This
+;; symbol takes input data in the first layer and then has subsequent
+;; layers of fully connected and relu activation layers, ending up in
+;; a softmax layer for output.

(let [data (sym/variable "data")
fc1 (sym/fully-connected "fc1" {:data data :num-hidden 128})
@@ -62,7 +80,7 @@
out (sym/softmax-output "softmax" {:data fc3})]
out) ;=>#object[org.apache.mxnet.Symbol 0x1f43a406 "org.apache.mxnet.Symbol@1f43a406"]

-;; You can also use as-> for easier threading
+;; You can also write this with the `as->` threading macro.


(def out (as-> (sym/variable "data") data
@@ -75,40 +93,62 @@
;=> #'tutorial.module/out
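
;; The middle of that `as->` form is folded in the diff above; a
;; minimal sketch of the full definition, mirroring the `let` version
;; (the fc2/fc3 layer sizes are assumptions, not taken from this
;; commit):
(def out (as-> (sym/variable "data") data
           (sym/fully-connected "fc1" {:data data :num-hidden 128})
           (sym/activation "relu1" {:data data :act-type "relu"})
           (sym/fully-connected "fc2" {:data data :num-hidden 64})
           (sym/activation "relu2" {:data data :act-type "relu"})
           (sym/fully-connected "fc3" {:data data :num-hidden 10})
           (sym/softmax-output "softmax" {:data data})))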


-;; By default, context is the CPU. If you need data parallelization, you can specify a GPU context or an array of GPU contexts.
-;; like this (m/module out {:contexts [(context/gpu)]})
+;; By default, the context is the CPU. If you need data
+;; parallelization, you can specify a GPU context or an array of GPU
+;; contexts, like this: `(m/module out {:contexts [(context/gpu)]})`

-;; Before you can compute with a module, you need to call `bind` to allocate the device memory and `initParams` or `set-params` to initialize the parameters. If you simply want to fit a module, you don’t need to call `bind` and `init-params` explicitly, because the `fit` function automatically calls them if they are needed.
+;; Before you can compute with a module, you need to call `bind` to
+;; allocate the device memory and `init-params` or `set-params` to
+;; initialize the parameters. If you simply want to fit a module, you
+;; don’t need to call `bind` and `init-params` explicitly, because the
+;; `fit` function automatically calls them if they are needed.

(let [mod (m/module out)]
(-> mod
(m/bind {:data-shapes (mx-io/provide-data train-data)
:label-shapes (mx-io/provide-label train-data)})
(m/init-params)))

-;; Now you can compute with the module using functions like `forward`, `backward`, etc.
+;; Now you can compute with the module using functions like `forward`,
+;; `backward`, etc.
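
;; A minimal sketch of that intermediate-level flow (not part of this
;; commit; it assumes `m/forward` takes a data batch and that these
;; calls return the module, so they can be threaded):
(def imod
  (-> (m/module out)
      (m/bind {:data-shapes (mx-io/provide-data train-data)
               :label-shapes (mx-io/provide-label train-data)})
      (m/init-params)))
(mx-io/do-batches train-data
                  (fn [batch]
                    (-> imod
                        (m/forward batch) ; compute outputs
                        (m/backward))))   ; compute gradients
(mx-io/reset train-data) ; restore the iterator for the steps below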


-;; Training, Predicting, and Evaluating
-
-;;Modules provide high-level APIs for training, predicting, and evaluating. To fit a module, call the `fit` function with some DataIters:
+;;;; Training and Predicting
+
+;; Modules provide high-level APIs for training, predicting, and
+;; evaluating. To fit a module, call the `fit` function with some data
+;; iterators:

-(def mod (m/fit (m/module out) {:train-data train-data :eval-data test-data :num-epoch 1}))
+(def mod
+  (m/fit (m/module out) {:train-data train-data
+                         :eval-data test-data
+                         :num-epoch 1}))
;; =>
;; Epoch 0 Train- [accuracy 0.12521666]
;; Epoch 0 Time cost- 8392
;; Epoch 0 Validation- [accuracy 0.2227]
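
;; The `fit` above uses default settings. A minimal sketch of
;; customizing it through `fit-params` (described just below); the
;; `optimizer` namespace alias and the :optimizer/:eval-metric keys
;; are assumptions, not verified against this commit:
(require '[org.apache.clojure-mxnet.optimizer :as optimizer])
(mx-io/reset train-data) ; reset iterators before fitting again
(mx-io/reset test-data)
(m/fit (m/module out)
       {:train-data train-data
        :eval-data test-data
        :num-epoch 1
        :fit-params (m/fit-params
                     {:optimizer (optimizer/sgd {:learning-rate 0.1})
                      :eval-metric (eval-metric/accuracy)})})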


-;; You can pass in batch-end callbacks using batch-end-callback and epoch-end callbacks using epoch-end-callback in the `fit-params`. You can also set parameters using functions like in the fit-params like optimizer and eval-metric. To learn more about the fit-params, see the fit-param function options. To predict with a module, call `predict` with a DataIter:
+;; You can pass in batch-end callbacks using `batch-end-callback` and
+;; epoch-end callbacks using `epoch-end-callback` in the
+;; `fit-params`. You can also set parameters in the `fit-params`, such
+;; as the optimizer and eval-metric. To learn more, see the
+;; `fit-params` function options. To predict with a module, call
+;; `predict` with a DataIter:

-(def results (m/predict mod {:eval-data test-data}))
+(def results
+  (m/predict mod {:eval-data test-data}))
(first results) ;=>#object[org.apache.mxnet.NDArray 0x3540b6d3 "org.apache.mxnet.NDArray@a48686ec"]

(first (ndarray/->vec (first results))) ;=>0.08261358

-;;The module collects and returns all of the prediction results. For more details about the format of the return values, see the documentation for the `predict` function.
+;; The module collects and returns all of the prediction results. For
+;; more details about the format of the return values, see the
+;; documentation for the `predict` function.

-;;When prediction results might be too large to fit in memory, use the `predict-every-batch` API
+;; When prediction results might be too large to fit in memory, use
+;; the `predict-every-batch` API.

(let [preds (m/predict-every-batch mod {:eval-data test-data})]
(mx-io/reduce-batches test-data
@@ -118,23 +158,33 @@
;;; do something
(inc i))))

-;;If you need to evaluate on a test set and don’t need the prediction output, call the `score` function with a DataIter and an EvalMetric:
+;; If you need to evaluate on a test set and don’t need the prediction
+;; output, call the `score` function with a data iterator and an eval
+;; metric:

-(m/score mod {:eval-data test-data :eval-metric (eval-metric/accuracy)}) ;=>["accuracy" 0.2227]
+(m/score mod {:eval-data test-data
+              :eval-metric (eval-metric/accuracy)}) ;=>["accuracy" 0.2227]

-;;This runs predictions on each batch in the provided DataIter and computes the evaluation score using the provided EvalMetric. The evaluation results are stored in metric so that you can query later.
+;; This runs predictions on each batch in the provided DataIter and
+;; computes the evaluation score using the provided EvalMetric. The
+;; evaluation results are stored in the metric so that you can query
+;; them later.

-;;Saving and Loading Module Parameters
-
-;;To save the module parameters in each training epoch, use a `checkpoint` function
+;;;; Saving and Loading
+
+;; To save the module parameters in each training epoch, use the
+;; `save-checkpoint` function:

(let [save-prefix "my-model"]
(doseq [epoch-num (range 3)]
(mx-io/do-batches train-data (fn [batch
;; do something
-                                   ]))
-    (m/save-checkpoint mod {:prefix save-prefix :epoch epoch-num :save-opt-states true})))
+                                   ]))
+    (m/save-checkpoint mod {:prefix save-prefix
+                            :epoch epoch-num
+                            :save-opt-states true})))

;; INFO org.apache.mxnet.module.Module: Saved checkpoint to my-model-0000.params
;; INFO org.apache.mxnet.module.Module: Saved optimizer state to my-model-0000.states
@@ -144,20 +194,22 @@
;; INFO org.apache.mxnet.module.Module: Saved optimizer state to my-model-0002.states


-;;To load the saved module parameters, call the `load-checkpoint` function:
+;; To load the saved module parameters, call the `load-checkpoint`
+;; function:

(def new-mod (m/load-checkpoint {:prefix "my-model" :epoch 1 :load-optimizer-states true}))

new-mod ;=> #object[org.apache.mxnet.module.Module 0x5304d0f4 "org.apache.mxnet.module.Module@5304d0f4"]

-;;To initialize parameters, Bind the symbols to construct executors first with bind function. Then, initialize the parameters and auxiliary states by calling `init-params` function.
+;; To initialize parameters, bind the symbols to construct executors
+;; first with the `bind` function. Then, initialize the parameters and
+;; auxiliary states by calling the `init-params` function.
(-> new-mod
-    (m/bind {:data-shapes (mx-io/provide-data train-data) :label-shapes (mx-io/provide-label train-data)})
+    (m/bind {:data-shapes (mx-io/provide-data train-data)
+             :label-shapes (mx-io/provide-label train-data)})
(m/init-params))

-;;To get current parameters, use `params`
+;; To get current parameters, use `params`:
(let [[arg-params aux-params] (m/params new-mod)]
{:arg-params arg-params
:aux-params aux-params})
@@ -178,20 +230,24 @@
;; :aux-params {}}


-;;To assign parameter and aux state values, use `set-params` function.
-
-(m/set-params new-mod {:arg-params (m/arg-params new-mod) :aux-params (m/aux-params new-mod)})
+;; To assign parameter and aux state values, use the `set-params`
+;; function:
+(m/set-params new-mod {:arg-params (m/arg-params new-mod)
+                       :aux-params (m/aux-params new-mod)})
;=> #object[org.apache.mxnet.module.Module 0x5304d0f4 "org.apache.mxnet.module.Module@5304d0f4"]

-;;To resume training from a saved checkpoint, instead of calling `set-params`, directly call `fit`, passing the loaded parameters, so that `fit` knows to start from those parameters instead of initializing randomly:
+;; To resume training from a saved checkpoint, pass the loaded
+;; parameters to the `fit` function. This will prevent `fit` from
+;; initializing randomly.

-;; reset the training data before calling fit or you will get an error
+;; (First, reset the training data before calling `fit` or you will
+;; get an error.)
(mx-io/reset train-data)
(mx-io/reset test-data)

-(m/fit new-mod {:train-data train-data :eval-data test-data :num-epoch 2
-                :fit-params (-> (m/fit-params {:begin-epoch 1}))})
-
-;;Create fit-params, and then use it to set `begin-epoch` so that fit() knows to resume from a saved epoch.
+;; Create `fit-params` and then use it to set `begin-epoch` so that
+;; `fit` knows to resume from a saved epoch.
+(m/fit new-mod {:train-data train-data
+                :eval-data test-data
+                :num-epoch 2
+                :fit-params (m/fit-params {:begin-epoch 1})})