From 25dea8f5528be7f3f74ca2c2fa42f528f6cdd4eb Mon Sep 17 00:00:00 2001
From: kvartet <48014605+kvartet@users.noreply.github.com>
Date: Mon, 26 Apr 2021 00:15:25 +0800
Subject: [PATCH] Add release note for v2.2 (#3539)
---
README.md | 2 +-
docs/en_US/Release.rst | 60 ++++++++++++++++++++++++++++++-
docs/en_US/Tuner/BuiltinTuner.rst | 2 +-
docs/en_US/Tuner/PPOTuner.rst | 6 ++--
4 files changed, 64 insertions(+), 6 deletions(-)
diff --git a/README.md b/README.md
index e08512cefe..4fdf4fe914 100644
--- a/README.md
+++ b/README.md
@@ -27,7 +27,7 @@ The tool manages automated machine learning (AutoML) experiments, **dispatches a
## **What's NEW!**
* **New release**: [v2.1 is available](https://github.com/microsoft/nni/releases) - _released on Mar-10-2021_
-* **New demo available**: [Youtube entry](https://www.youtube.com/channel/UCKcafm6861B2mnYhPbZHavw) | [Bilibili 入口](https://space.bilibili.com/1649051673) - _last updated on Feb-19-2021_
+* **New demo available**: [Youtube entry](https://www.youtube.com/channel/UCKcafm6861B2mnYhPbZHavw) | [Bilibili 入口](https://space.bilibili.com/1649051673) - _last updated on Apr-21-2021_
* **New use case sharing**: [Cost-effective Hyper-parameter Tuning using AdaptDL with NNI](https://medium.com/casl-project/cost-effective-hyper-parameter-tuning-using-adaptdl-with-nni-e55642888761) - _posted on Feb-23-2021_
diff --git a/docs/en_US/Release.rst b/docs/en_US/Release.rst
index 3432e22f62..419034ed5e 100644
--- a/docs/en_US/Release.rst
+++ b/docs/en_US/Release.rst
@@ -5,6 +5,64 @@
Change Log
==========
+Release 2.2 - 4/22/2021
+-----------------------
+
+Major updates
+^^^^^^^^^^^^^
+
+Neural Architecture Search
+""""""""""""""""""""""""""
+
+* Improve NAS 2.0 (Retiarii) Framework (Alpha Release)
+
+ * Support local debug mode (#3476)
+ * Support nesting ``ValueChoice`` in ``LayerChoice`` (#3508)
+ * Support dict/list types in ``ValueChoice`` (#3508); a short sketch follows this list
+ * Improve the format of exported architectures (#3464)
+ * Refactor the NAS examples (#3513)
+ * Refer to `here `__ for the Retiarii roadmap
+
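+A minimal sketch of the new ``ValueChoice`` support, assuming the Retiarii
+module path used in the NAS 2.0 docs (``nni.retiarii.nn.pytorch``); treat it
+as illustrative rather than a definitive API reference:
+
+.. code-block:: python
+
+   import nni.retiarii.nn.pytorch as nn
+
+   class Net(nn.Module):
+       def __init__(self):
+           super().__init__()
+           # a ValueChoice nested inside a LayerChoice candidate (#3508):
+           # the kernel size of the convolution branch is itself searched
+           self.block = nn.LayerChoice([
+               nn.Conv2d(3, 16, kernel_size=nn.ValueChoice([3, 5, 7])),
+               nn.MaxPool2d(kernel_size=3),
+           ])
+           # dict-typed candidates are also accepted (#3508), e.g.
+           # nn.ValueChoice([{'kernel_size': 3}, {'kernel_size': 5}])
+
+       def forward(self, x):
+           return self.block(x)
+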
+Model Compression
+"""""""""""""""""
+
+* Support speedup for mixed-precision quantization models (experimental) (#3488 #3512)
+* Support model export for quantization algorithms (#3458 #3473); a sketch follows this list
+* Support model export in model compression for TensorFlow (#3487)
+* Improve documentation (#3482)
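+
+A hedged sketch of the quantization export flow; ``QAT_Quantizer`` and
+``export_model`` follow the compression docs, but the exact arguments here
+are indicative only:
+
+.. code-block:: python
+
+   from nni.algorithms.compression.pytorch.quantization import QAT_Quantizer
+
+   # `model`, `config_list` and `optimizer` are assumed to be defined elsewhere
+   quantizer = QAT_Quantizer(model, config_list, optimizer)
+   quantizer.compress()
+   # ... run quantization-aware training ...
+   quantizer.export_model(model_path='quantized.pth',
+                          calibration_path='calibration.pth')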
+
+nnictl & nni.experiment
+"""""""""""""""""""""""
+
+* Add native support for experiment config V2 (#3466 #3540 #3552)
+* Add resume and view modes in the Python API ``nni.experiment`` (#3490 #3524 #3545); a sketch follows this list
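+
+A minimal sketch of the resume and view modes; the class methods
+``Experiment.resume`` and ``Experiment.view`` follow the ``nni.experiment``
+docs, so check the API reference for exact signatures:
+
+.. code-block:: python
+
+   from nni.experiment import Experiment
+
+   # resume a previously stopped experiment by its ID (placeholder shown)
+   Experiment.resume('EXPERIMENT_ID', port=8080)
+
+   # open a finished experiment in read-only view mode
+   Experiment.view('EXPERIMENT_ID', port=8081)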
+
+Training Service
+""""""""""""""""
+
+* Support unmounting shared storage in the remote training service (#3456)
+* Support Windows as the remote training service in reuse mode (#3500)
+* Remove duplicated env folder in remote training service (#3472)
+* Add log information for GPU metric collector (#3506)
+
+WebUI
+"""""
+
+* Support launching TensorBoard from the WebUI (#3454 #3361 #3531)
+* Upgrade echarts-for-react to v5 (#3457)
+* Add line wrapping to the dispatcher/nnimanager log Monaco editor (#3461)
+
+Bug Fixes
+^^^^^^^^^
+
+* Fix a bug in the FLOPs counter (#3497)
+* Fix a conflict between the hyper-parameter Add/Remove axes buttons and the table Add/Remove columns buttons (#3491)
+* Fix a bug where the Monaco editor search text was not displayed completely (#3492)
+* Fix a bug in Cream NAS (#3498, thanks to external contributor @AliCloud-PAI)
+* Fix typos in docs (#3448, thanks to external contributor @OliverShang)
+* Fix a typo in NAS 1.0 (#3538, thanks to external contributor @ankitaggarwal23)
+
+
Release 2.1 - 3/10/2021
-----------------------
@@ -289,7 +347,7 @@ Documentation
* Fix several typos and grammar mistakes in documentation (#2637 #2638, thanks @tomzx)
* Improve AzureML training service documentation (#2631)
* Improve CI of Chinese translation (#2654)
-* Improve OpenPAI training service documenation (#2685)
+* Improve OpenPAI training service documentation (#2685)
* Improve documentation of community sharing (#2640)
* Add tutorial of Colab support (#2700)
* Improve documentation structure for model compression (#2676)
diff --git a/docs/en_US/Tuner/BuiltinTuner.rst b/docs/en_US/Tuner/BuiltinTuner.rst
index 9d5f0fbc4f..768913b373 100644
--- a/docs/en_US/Tuner/BuiltinTuner.rst
+++ b/docs/en_US/Tuner/BuiltinTuner.rst
@@ -512,7 +512,7 @@ Note that the only acceptable types within the search space are ``layer_choice``
**Suggested scenario**
-PPOTuner is a Reinforcement Learning tuner based on the PPO algorithm. PPOTuner can be used when using the NNI NAS interface to do neural architecture search. In general, the Reinforcement Learning algorithm needs more computing resources, though the PPO algorithm is relatively more efficient than others. It's recommended to use this tuner when you have a large amount of computional resources available. You could try it on a very simple task, such as the :githublink:`mnist-nas ` example. `See details <./PPOTuner.rst>`__
+PPOTuner is a Reinforcement Learning tuner based on the PPO algorithm. It can be used with the NNI NAS interface to do neural architecture search. In general, Reinforcement Learning algorithms need more computing resources, though the PPO algorithm is relatively more efficient than others. It's recommended to use this tuner when you have a large amount of computational resources available. You could try it on a very simple task, such as the :githublink:`mnist-nas ` example. `See details <./PPOTuner.rst>`__
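+
+As a hedged usage sketch, PPOTuner can be selected through the Python
+experiment API (field names mirror the experiment config docs and are
+indicative only):
+
+.. code-block:: python
+
+   from nni.experiment import Experiment
+
+   experiment = Experiment('local')
+   # choose PPOTuner and ask it to maximize the reported reward
+   experiment.config.tuner.name = 'PPOTuner'
+   experiment.config.tuner.class_args = {'optimize_mode': 'maximize'}
+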
**classArgs Requirements:**
diff --git a/docs/en_US/Tuner/PPOTuner.rst b/docs/en_US/Tuner/PPOTuner.rst
index 5ad6414c92..ce769e1d3b 100644
--- a/docs/en_US/Tuner/PPOTuner.rst
+++ b/docs/en_US/Tuner/PPOTuner.rst
@@ -7,15 +7,15 @@ PPOTuner
This is a tuner geared for NNI's Neural Architecture Search (NAS) interface. It uses the `ppo algorithm `__. The implementation inherits the main logic of the ppo2 OpenAI implementation `here `__ and is adapted for the NAS scenario.
We have successfully tuned the mnist-nas example, with the following result:
-**NOTE: we are refactoring this example to the latest NAS interface, will publish the example codes after the refactor.**
+.. note:: We are refactoring this example to the latest NAS interface and will publish the example code once the refactoring is complete.
.. image:: ../../img/ppo_mnist.png
:target: ../../img/ppo_mnist.png
:alt:
-We also tune :githublink:`the macro search space for image classification in the enas paper ` (with a limited epoch number for each trial, i.e., 8 epochs), which is implemented using the NAS interface and tuned with PPOTuner. Here is Figure 7 from the `enas paper `__ to show what the search space looks like
+We also tuned :githublink:`the macro search space for image classification in the enas paper ` (with a limited number of epochs per trial, i.e., 8 epochs), which is implemented using the NAS interface and tuned with PPOTuner. Here is Figure 7 from the `enas paper `__, showing what the search space looks like:
.. image:: ../../img/enas_search_space.png
@@ -25,7 +25,7 @@ We also tune :githublink:`the macro search space for image classification in the
The figure above shows the chosen architecture. Each square is a layer whose operation was chosen from 6 options. Each dashed line is a skip connection; each square layer can choose 0 or 1 skip connections, getting its output from a previous layer. **Note that**\ , in the original macro search space, each square layer could choose any number of skip connections, while in our implementation, it is only allowed to choose 0 or 1.
-The results are shown in figure below (see the experimenal config :githublink:`here `\ :
+The results are shown in the figure below (see the experimental config :githublink:`here `\ ):
.. image:: ../../img/ppo_cifar10.png