
Commit ee71f16

Merge pull request #280 from microsoft/master ("merge master")
2 parents: cb9efcc + dec91f7

89 files changed: +3513 -2566 lines


README.md (+1 -1)

@@ -5,7 +5,7 @@
 -----------
 
 [![MIT licensed](https://img.shields.io/badge/license-MIT-brightgreen.svg)](LICENSE)
-[![Build Status](https://msrasrg.visualstudio.com/NNIOpenSource/_apis/build/status/integration-test-local?branchName=master)](https://msrasrg.visualstudio.com/NNIOpenSource/_build/latest?definitionId=17&branchName=master)
+[![Build Status](https://msrasrg.visualstudio.com/NNIOpenSource/_apis/build/status/full%20test%20-%20linux?branchName=master)](https://msrasrg.visualstudio.com/NNIOpenSource/_build/latest?definitionId=62&branchName=master)
 [![Issues](https://img.shields.io/github/issues-raw/Microsoft/nni.svg)](https://github.com/Microsoft/nni/issues?q=is%3Aissue+is%3Aopen)
 [![Bugs](https://img.shields.io/github/issues/Microsoft/nni/bug.svg)](https://github.com/Microsoft/nni/issues?q=is%3Aissue+is%3Aopen+label%3Abug)
 [![Pull Requests](https://img.shields.io/github/issues-pr-raw/Microsoft/nni.svg)](https://github.com/Microsoft/nni/pulls?q=is%3Apr+is%3Aopen)

azure-pipelines.yml (+2 -4)

@@ -34,8 +34,7 @@ jobs:
       python3 -m pip install gym onnx peewee thop --user
       python3 -m pip install sphinx==1.8.3 sphinx-argparse==0.2.5 sphinx-markdown-tables==0.0.9 sphinx-rtd-theme==0.4.2 sphinxcontrib-websupport==1.1.0 recommonmark==0.5.0 nbsphinx --user
       sudo apt-get install swig -y
-      nnictl package install --name=SMAC
-      nnictl package install --name=BOHB
+      python3 -m pip install -e .[SMAC,BOHB]
     displayName: 'Install dependencies'
   - script: |
       cd test
@@ -73,8 +72,7 @@ jobs:
       python3 -m pip install keras==2.1.6 --user
       python3 -m pip install gym onnx peewee --user
       sudo apt-get install swig -y
-      nnictl package install --name=SMAC
-      nnictl package install --name=BOHB
+      python3 -m pip install -e .[SMAC,BOHB]
     displayName: 'Install dependencies'
   - script: |
      set -e
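This change swaps the two `nnictl package install` steps for pip's "extras" syntax: `-e .[SMAC,BOHB]` installs the local package in editable mode together with two optional dependency groups. As a hedged sketch of the mechanism, extras are declared with setuptools roughly like this; the extras names match the command above, but the dependency lists are illustrative assumptions, not NNI's actual `setup.py`:

```python
# Hypothetical sketch of declaring optional dependency groups ("extras")
# with setuptools; the dependency lists are assumptions for illustration
# and do not reproduce NNI's real setup.py.
from setuptools import setup, find_packages

setup(
    name='nni',
    packages=find_packages(),
    extras_require={
        # `pip install -e .[SMAC]` pulls this group in on top of the base install
        'SMAC': ['smac'],
        # `pip install -e .[BOHB]` pulls in ConfigSpace for the BOHB advisor
        'BOHB': ['ConfigSpace'],
    },
)
```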

deployment/registered_algorithms.yml (new file, +78)

advisors:
- builtinName: Hyperband
  classArgsValidator: nni.algorithms.hpo.hyperband_advisor.hyperband_advisor.HyperbandClassArgsValidator
  className: nni.algorithms.hpo.hyperband_advisor.hyperband_advisor.Hyperband
  source: nni
- builtinName: BOHB
  classArgsValidator: nni.algorithms.hpo.bohb_advisor.bohb_advisor.BOHBClassArgsValidator
  className: nni.algorithms.hpo.bohb_advisor.bohb_advisor.BOHB
  source: nni
assessors:
- builtinName: Medianstop
  classArgsValidator: nni.algorithms.hpo.medianstop_assessor.medianstop_assessor.MedianstopClassArgsValidator
  className: nni.algorithms.hpo.medianstop_assessor.medianstop_assessor.MedianstopAssessor
  source: nni
- builtinName: Curvefitting
  classArgsValidator: nni.algorithms.hpo.curvefitting_assessor.curvefitting_assessor.CurvefittingClassArgsValidator
  className: nni.algorithms.hpo.curvefitting_assessor.curvefitting_assessor.CurvefittingAssessor
  source: nni
tuners:
- builtinName: PPOTuner
  classArgsValidator: nni.algorithms.hpo.ppo_tuner.ppo_tuner.PPOClassArgsValidator
  className: nni.algorithms.hpo.ppo_tuner.ppo_tuner.PPOTuner
  source: nni
- builtinName: SMAC
  classArgsValidator: nni.algorithms.hpo.smac_tuner.smac_tuner.SMACClassArgsValidator
  className: nni.algorithms.hpo.smac_tuner.smac_tuner.SMACTuner
  source: nni
- builtinName: TPE
  classArgs:
    algorithm_name: tpe
  classArgsValidator: nni.algorithms.hpo.hyperopt_tuner.hyperopt_tuner.HyperoptClassArgsValidator
  className: nni.algorithms.hpo.hyperopt_tuner.hyperopt_tuner.HyperoptTuner
  source: nni
- acceptClassArgs: false
  builtinName: Random
  classArgs:
    algorithm_name: random_search
  classArgsValidator: nni.algorithms.hpo.hyperopt_tuner.hyperopt_tuner.HyperoptClassArgsValidator
  className: nni.algorithms.hpo.hyperopt_tuner.hyperopt_tuner.HyperoptTuner
  source: nni
- builtinName: Anneal
  classArgs:
    algorithm_name: anneal
  classArgsValidator: nni.algorithms.hpo.hyperopt_tuner.hyperopt_tuner.HyperoptClassArgsValidator
  className: nni.algorithms.hpo.hyperopt_tuner.hyperopt_tuner.HyperoptTuner
  source: nni
- builtinName: Evolution
  classArgsValidator: nni.algorithms.hpo.evolution_tuner.evolution_tuner.EvolutionClassArgsValidator
  className: nni.algorithms.hpo.evolution_tuner.evolution_tuner.EvolutionTuner
  source: nni
- acceptClassArgs: false
  builtinName: BatchTuner
  className: nni.algorithms.hpo.batch_tuner.batch_tuner.BatchTuner
  source: nni
- acceptClassArgs: false
  builtinName: GridSearch
  className: nni.algorithms.hpo.gridsearch_tuner.gridsearch_tuner.GridSearchTuner
  source: nni
- builtinName: NetworkMorphism
  classArgsValidator: nni.algorithms.hpo.networkmorphism_tuner.networkmorphism_tuner.NetworkMorphismClassArgsValidator
  className: nni.algorithms.hpo.networkmorphism_tuner.networkmorphism_tuner.NetworkMorphismTuner
  source: nni
- builtinName: MetisTuner
  classArgsValidator: nni.algorithms.hpo.metis_tuner.metis_tuner.MetisClassArgsValidator
  className: nni.algorithms.hpo.metis_tuner.metis_tuner.MetisTuner
  source: nni
- builtinName: GPTuner
  classArgsValidator: nni.algorithms.hpo.gp_tuner.gp_tuner.GPClassArgsValidator
  className: nni.algorithms.hpo.gp_tuner.gp_tuner.GPTuner
  source: nni
- builtinName: PBTTuner
  classArgsValidator: nni.algorithms.hpo.pbt_tuner.pbt_tuner.PBTClassArgsValidator
  className: nni.algorithms.hpo.pbt_tuner.pbt_tuner.PBTTuner
  source: nni
- builtinName: RegularizedEvolutionTuner
  classArgsValidator: nni.algorithms.hpo.regularized_evolution_tuner.regularized_evolution_tuner.EvolutionClassArgsValidator
  className: nni.algorithms.hpo.regularized_evolution_tuner.regularized_evolution_tuner.RegularizedEvolutionTuner
  source: nni
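Each entry maps a `builtinName` to a fully qualified `className`, which suggests the registry is consumed by dynamic import. A minimal sketch of how such a file could be read; this illustrates the pattern only and is not NNI's actual loader:

```python
# Illustrative loader for a registry file in the format above; a sketch
# of the dynamic-import pattern, not NNI's actual implementation.
import importlib

import yaml  # assumes PyYAML is available


def resolve_builtin(registry_path: str, section: str, name: str):
    """Look up `name` in `section` ('tuners', 'assessors' or 'advisors')
    and import the registered class."""
    with open(registry_path) as f:
        registry = yaml.safe_load(f)
    for entry in registry.get(section, []):
        if entry['builtinName'] == name:
            module_path, class_name = entry['className'].rsplit('.', 1)
            return getattr(importlib.import_module(module_path), class_name)
    raise KeyError(f'{name!r} not registered under {section!r}')


# e.g. resolve_builtin('deployment/registered_algorithms.yml', 'tuners', 'TPE')
```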

docs/archive_en_US/NAS/Benchmarks.md (+24 -3)

@@ -1,5 +1,7 @@
 # NAS Benchmarks
 
+[TOC]
+
 ```eval_rst
 .. toctree::
    :hidden:
@@ -9,7 +11,7 @@
 
 ## Introduction
 
-To imporve the reproducibility of NAS algorithms as well as reducing computing resource requirements, researchers proposed a series of NAS benchmarks such as [NAS-Bench-101](https://arxiv.org/abs/1902.09635), [NAS-Bench-201](https://arxiv.org/abs/2001.00326), [NDS](https://arxiv.org/abs/1905.13214), etc. NNI provides a query interface for users to acquire these benchmarks. Within just a few lines of code, researcher are able to evaluate their NAS algorithms easily and fairly by utilizing these benchmarks.
+To imporve the reproducibility of NAS algorithms as well as reducing computing resource requirements, researchers proposed a series of NAS benchmarks such as [NAS-Bench-101](https://arxiv.org/abs/1902.09635), [NAS-Bench-201](https://arxiv.org/abs/2001.00326), [NDS](https://arxiv.org/abs/1905.13214), [NLP](https://arxiv.org/abs/2006.07116), etc. NNI provides a query interface for users to acquire these benchmarks. Within just a few lines of code, researcher are able to evaluate their NAS algorithms easily and fairly by utilizing these benchmarks.
 
 ## Prerequisites
 
@@ -27,7 +29,7 @@ cd nni/examples/nas/benchmarks
 ```
 Replace `${NNI_VERSION}` with a released version name or branch name, e.g., `v1.9`.
 
-2. Install dependencies via `pip3 install -r xxx.requirements.txt`. `xxx` can be `nasbench101`, `nasbench201` or `nds`.
+2. Install dependencies via `pip3 install -r xxx.requirements.txt`. `xxx` can be `nasbench101`, `nasbench201`, `nds` or `nlp`.
 3. Generate the database via `./xxx.sh`. The directory that stores the benchmark file can be configured with `NASBENCHMARK_DIR` environment variable, which defaults to `~/.nni/nasbenchmark`. Note that the NAS-Bench-201 dataset will be downloaded from a google drive.
 
 Please make sure there is at least 10GB free disk space and note that the conversion process can take up to hours to complete.
@@ -109,7 +111,7 @@ _On Network Design Spaces for Visual Recognition_ released trial statistics of o
 
 Instead of storing results obtained with different configurations in separate files, we dump them into one single database to enable comparison in multiple dimensions. Specifically, we use `model_family` to distinguish model types, `model_spec` for all hyper-parameters needed to build this model, `cell_spec` for detailed information on operators and connections if it is a NAS cell, `generator` to denote the sampling policy through which this configuration is generated. Refer to API documentation for details.
 
-## Available Operators
+### Available Operators
 
 Here is a list of available operators used in NDS.
 
@@ -158,3 +160,22 @@ Here is a list of available operators used in NDS.
 
 .. autoclass:: nni.nas.benchmarks.nds.NdsIntermediateStats
 ```
+
+## NLP
+
+[Paper link](https://arxiv.org/abs/2006.07116)   [Open-source](https://github.com/fmsnew/nas-bench-nlp-release)
+
+The paper "NAS-Bench-NLP: Neural Architecture Search Benchmark for Natural Language Processing" have provided search space of recurrent neural networks on the text datasets and trained 14k architectures within it, and have conducted both intrinsic and extrinsic evaluation of the trained models using datasets for semantic relatedness and language understanding evaluation. There are 2 datasets - PTB and wikitext-2. In the end, the precomputed results(ptb_single_run + ptb_multi_run + wikitext-2) can be utilized.
+
+### API Documentation
+
+```eval_rst
+.. autofunction:: nni.nas.benchmarks.nlp.query_nlp_trial_stats
+
+.. autoclass:: nni.nas.benchmarks.nlp.NlpTrialConfig
+
+.. autoclass:: nni.nas.benchmarks.nlp.NlpTrialStats
+
+.. autoclass:: nni.nas.benchmarks.nlp.NlpIntermediateStats
+```
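The autogenerated API docs above define the real signature of the new `query_nlp_trial_stats` entry point; as a hedged sketch, a query presumably mirrors the sibling benchmark interfaces, yielding trial records filtered by architecture and dataset:

```python
# Hedged usage sketch for the new NLP benchmark; the argument names and
# record shape are assumptions modelled on the other benchmark query
# functions. Consult the autogenerated API docs for the real signature.
from nni.nas.benchmarks.nlp import query_nlp_trial_stats

# Passing None as the architecture filter is assumed to match every trial,
# mirroring the other benchmarks' query functions.
for trial in query_nlp_trial_stats(None, 'ptb'):
    print(trial)  # one dict-like record of config plus final metrics
    break
```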

docs/archive_en_US/TrainingService/AdaptDLMode.md (+18 -13)

@@ -52,6 +52,7 @@ trialConcurrency: 2
 maxTrialNum: 2
 
 trial:
+  namespace: <k8s_namespace>
   adaptive: false # optional.
   image: <image_tag>
   imagePullSecrets: # optional
@@ -66,7 +67,7 @@
     path: /
     containerMountPath: /nfs
   checkpoint: # optional
-    storageClass: microk8s-hostpath
+    storageClass: dfs
     storageSize: 1Gi
 ```
 
@@ -79,18 +80,22 @@ IP address of the machine with NNI manager (NNICTL) that launches NNI experiment
 * **logCollection**: *Recommended* to set as `http`. It will collect the trial logs on cluster back to your machine via http.
 * **tuner**: It supports the Tuun tuner and all NNI built-in tuners (only except for the checkpoint feature of the NNI PBT tuners).
 * **trial**: It defines the specs of an `adl` trial.
-  * **adaptive**: (*Optional*) Boolean for AdaptDL trainer. While `true`, it the job is preemptible and adaptive.
-  * **image**: Docker image for the trial
-  * **imagePullSecret**: (*Optional*) If you are using a private registry,
-    you need to provide the secret to successfully pull the image.
-  * **codeDir**: the working directory of the container. `.` means the default working directory defined by the image.
-  * **command**: the bash command to start the trial
-  * **gpuNum**: the number of GPUs requested for this trial. It must be non-negative integer.
-  * **cpuNum**: (*Optional*) the number of CPUs requested for this trial. It must be non-negative integer.
-  * **memorySize**: (*Optional*) the size of memory requested for this trial. It must follow the Kubernetes
-    [default format](https://kubernetes.io/docs/concepts/configuration/manage-resources-containers/#meaning-of-memory).
-  * **nfs**: (*Optional*) mounting external storage. For more information about using NFS please check the below paragraph.
-  * **checkpoint** (*Optional*) [storage settings](https://kubernetes.io/docs/concepts/storage/storage-classes/) for AdaptDL internal checkpoints. You can keep it optional if you are not dev users.
+  * **namespace**: (*Optional*) Kubernetes namespace to launch the trials. Default to `default` namespace.
+  * **adaptive**: (*Optional*) Boolean for AdaptDL trainer. While `true`, it the job is preemptible and adaptive.
+  * **image**: Docker image for the trial
+  * **imagePullSecret**: (*Optional*) If you are using a private registry,
+    you need to provide the secret to successfully pull the image.
+  * **codeDir**: the working directory of the container. `.` means the default working directory defined by the image.
+  * **command**: the bash command to start the trial
+  * **gpuNum**: the number of GPUs requested for this trial. It must be non-negative integer.
+  * **cpuNum**: (*Optional*) the number of CPUs requested for this trial. It must be non-negative integer.
+  * **memorySize**: (*Optional*) the size of memory requested for this trial. It must follow the Kubernetes
+    [default format](https://kubernetes.io/docs/concepts/configuration/manage-resources-containers/#meaning-of-memory).
+  * **nfs**: (*Optional*) mounting external storage. For more information about using NFS please check the below paragraph.
+  * **checkpoint**: (*Optional*) storage settings for model checkpoints.
+    * **storageClass**: check [Kubernetes storage documentation](https://kubernetes.io/docs/concepts/storage/storage-classes/) for how to use the appropriate `storageClass`.
+    * **storageSize**: this value should be large enough to fit your model's checkpoints, or it could cause disk quota exceeded error.
+
 
 ### NFS Storage
 
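The expanded `trial` field list in this hunk is straightforward to sanity-check before submitting an experiment. A hedged pre-flight sketch follows; this validation is illustrative, not part of NNI, and the rules just restate the bullets above:

```python
# Illustrative pre-flight validation of the documented `trial` fields;
# not part of NNI. The memorySize regex is a simplified subset of the
# Kubernetes quantity syntax linked above.
import re

import yaml

K8S_QUANTITY = re.compile(r'^\d+(Ki|Mi|Gi|Ti|Pi|K|M|G|T|P)?$')


def check_trial_section(config_path: str) -> None:
    with open(config_path) as f:
        trial = yaml.safe_load(f)['trial']
    trial.setdefault('namespace', 'default')  # optional, defaults to `default`
    assert int(trial['gpuNum']) >= 0, 'gpuNum must be a non-negative integer'
    if 'memorySize' in trial:
        assert K8S_QUANTITY.match(str(trial['memorySize'])), \
            'memorySize must follow the Kubernetes quantity format'
```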

docs/archive_en_US/Tuner/BohbAdvisor.md (+1 -1)

@@ -46,7 +46,7 @@ The sampling procedure (using Multidimensional KDE to guide selection) is summar
 BOHB advisor requires the [ConfigSpace](https://github.com/automl/ConfigSpace) package. ConfigSpace can be installed using the following command.
 
 ```bash
-nnictl package install --name=BOHB
+pip install nni[BOHB]
 ```
 
 To use BOHB, you should add the following spec in your experiment's YAML config file:
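ConfigSpace, the package this hunk's install command provides, is the search-space library BOHB builds on. A couple of lines of plain ConfigSpace usage, independent of NNI, illustrate what the dependency offers; NNI constructs such spaces internally from your search space file:

```python
# Plain ConfigSpace usage, shown only to illustrate the BOHB dependency.
import ConfigSpace as CS

cs = CS.ConfigurationSpace()
cs.add_hyperparameter(CS.UniformFloatHyperparameter('lr', 1e-4, 1e-1, log=True))
cs.add_hyperparameter(CS.CategoricalHyperparameter('optimizer', ['sgd', 'adam']))
print(cs.sample_configuration())  # draw one random configuration from the space
```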

docs/archive_en_US/Tuner/BuiltinTuner.md (+7 -5)

@@ -12,7 +12,7 @@ Currently, we support the following algorithms:
 |[__Random Search__](#Random)|In Random Search for Hyper-Parameter Optimization show that Random Search might be surprisingly simple and effective. We suggest that we could use Random Search as the baseline when we have no knowledge about the prior distribution of hyper-parameters. [Reference Paper](http://www.jmlr.org/papers/volume13/bergstra12a/bergstra12a.pdf)|
 |[__Anneal__](#Anneal)|This simple annealing algorithm begins by sampling from the prior, but tends over time to sample from points closer and closer to the best ones observed. This algorithm is a simple variation on the random search that leverages smoothness in the response surface. The annealing rate is not adaptive.|
 |[__Naïve Evolution__](#Evolution)|Naïve Evolution comes from Large-Scale Evolution of Image Classifiers. It randomly initializes a population-based on search space. For each generation, it chooses better ones and does some mutation (e.g., change a hyperparameter, add/remove one layer) on them to get the next generation. Naïve Evolution requires many trials to work, but it's very simple and easy to expand new features. [Reference paper](https://arxiv.org/pdf/1703.01041.pdf)|
-|[__SMAC__](#SMAC)|SMAC is based on Sequential Model-Based Optimization (SMBO). It adapts the most prominent previously used model class (Gaussian stochastic process models) and introduces the model class of random forests to SMBO, in order to handle categorical parameters. The SMAC supported by NNI is a wrapper on the SMAC3 GitHub repo. Notice, SMAC needs to be installed by `nnictl package` command. [Reference Paper,](https://www.cs.ubc.ca/~hutter/papers/10-TR-SMAC.pdf) [GitHub Repo](https://github.com/automl/SMAC3)|
+|[__SMAC__](#SMAC)|SMAC is based on Sequential Model-Based Optimization (SMBO). It adapts the most prominent previously used model class (Gaussian stochastic process models) and introduces the model class of random forests to SMBO, in order to handle categorical parameters. The SMAC supported by NNI is a wrapper on the SMAC3 GitHub repo. Notice, SMAC needs to be installed by `pip install nni[SMAC]` command. [Reference Paper,](https://www.cs.ubc.ca/~hutter/papers/10-TR-SMAC.pdf) [GitHub Repo](https://github.com/automl/SMAC3)|
 |[__Batch tuner__](#Batch)|Batch tuner allows users to simply provide several configurations (i.e., choices of hyper-parameters) for their trial code. After finishing all the configurations, the experiment is done. Batch tuner only supports the type choice in search space spec.|
 |[__Grid Search__](#GridSearch)|Grid Search performs an exhaustive searching through a manually specified subset of the hyperparameter space defined in the searchspace file. Note that the only acceptable types of search space are choice, quniform, randint. |
 |[__Hyperband__](#Hyperband)|Hyperband tries to use limited resources to explore as many configurations as possible and returns the most promising ones as a final result. The basic idea is to generate many configurations and run them for a small number of trials. The half least-promising configurations are thrown out, the remaining are further trained along with a selection of new configurations. The size of these populations is sensitive to resource constraints (e.g. allotted search time). [Reference Paper](https://arxiv.org/pdf/1603.06560.pdf)|
@@ -27,7 +27,9 @@ Currently, we support the following algorithms:
 
 Using a built-in tuner provided by the NNI SDK requires one to declare the **builtinTunerName** and **classArgs** in the `config.yml` file. In this part, we will introduce each tuner along with information about usage and suggested scenarios, classArg requirements, and an example configuration.
 
-Note: Please follow the format when you write your `config.yml` file. Some built-in tuners need to be installed using `nnictl package`, like SMAC.
+Note: Please follow the format when you write your `config.yml` file. Some built-in tuners have
+dependencies need to be installed using `pip install nni[<tuner>]`, like SMAC's dependencies can
+be installed using `pip install nni[SMAC]`.
 
 <a name="TPE"></a>
 
@@ -144,10 +146,10 @@ tuner:
 
 **Installation**
 
-SMAC needs to be installed by following command before the first usage. As a reminder, `swig` is required for SMAC: for Ubuntu `swig` can be installed with `apt`.
+SMAC has dependencies need to be installed by following command before the first usage. As a reminder, `swig` is required for SMAC: for Ubuntu `swig` can be installed with `apt`.
 
 ```bash
-nnictl package install --name=SMAC
+pip install nni[SMAC]
 ```
 
 **Suggested scenario**
@@ -340,7 +342,7 @@ tuner:
 BOHB advisor requires [ConfigSpace](https://github.com/automl/ConfigSpace) package. ConfigSpace can be installed using the following command.
 
 ```bash
-nnictl package install --name=BOHB
+pip install nni[BOHB]
 ```
 
 **Suggested scenario**
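Since several built-in tuners now ship their heavy dependencies as pip extras, a quick import check can catch a missing extra before an experiment starts. A hedged sketch follows; the dependency mapping in `TUNER_DEPS` is an assumption for illustration, not a definitive list:

```python
# Hedged sketch: warn about missing optional tuner dependencies before
# launching. The module names in TUNER_DEPS are illustrative assumptions.
import importlib.util

TUNER_DEPS = {
    'SMAC': ['smac'],          # assumed module shipped by `pip install nni[SMAC]`
    'BOHB': ['ConfigSpace'],   # assumed module shipped by `pip install nni[BOHB]`
}


def missing_deps(tuner_name: str) -> list:
    """Return the modules a tuner needs that are not importable."""
    return [mod for mod in TUNER_DEPS.get(tuner_name, [])
            if importlib.util.find_spec(mod) is None]


for tuner in ('SMAC', 'BOHB'):
    if missing_deps(tuner):
        print(f'{tuner}: run `pip install nni[{tuner}]` first')
```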
