feat: Support random state in random search (#873)
* feat: Support random search

Signed-off-by: Ce Gao <[email protected]>

* feat: Update docs

Signed-off-by: Ce Gao <[email protected]>
gaocegege authored and k8s-ci-robot committed Oct 11, 2019
1 parent 030b691 commit fb6739c
Showing 3 changed files with 26 additions and 5 deletions.
9 changes: 7 additions & 2 deletions docs/algorithm-settings.md
@@ -35,7 +35,12 @@ Random sampling is an alternative to grid search when the number of discrete par

### [Hyperopt][]

Algorithm name in katib is `random`.
Algorithm name in katib is `random`, and these are the algorithm settings that we support:

| Setting Name | Description | Example |
|--------------|-------------|---------|
| random_state | [int]: Set random state to something other than None for reproducible results. | 10 |
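
For context beyond this diff, the setting would be supplied through the Experiment's algorithm spec. A hedged sketch of such a manifest fragment (field names per the v1alpha3 API; values illustrative, not part of this commit):

```yaml
apiVersion: "kubeflow.org/v1alpha3"
kind: Experiment
spec:
  algorithm:
    algorithmName: random
    algorithmSettings:
      - name: random_state
        value: "10"
```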


## TPE

@@ -75,7 +80,7 @@ Algorithm name in katib is `skopt-bayesian-optimization`, and there are some alg
| n_initial_points | [int, default=10]: Number of evaluations of `func` with initialization points before approximating it with `base_estimator`. Points provided as `x0` count as initialization points. If len(x0) < n_initial_points additional points are sampled at random. More in [skopt document](https://scikit-optimize.github.io/#skopt.Optimizer) | 10 |
| acq_func | [string, default=`"gp_hedge"`]: Function to minimize over the posterior distribution. More in [skopt document](https://scikit-optimize.github.io/#skopt.Optimizer) | gp_hedge |
| acq_optimizer | [string, "sampling" or "lbfgs", default="auto"]: Method to minimize the acquisition function. The fit model is updated with the optimal value obtained by optimizing acq_func with acq_optimizer. More in [skopt document](https://scikit-optimize.github.io/#skopt.Optimizer) | auto |
| random_state | [int, RandomState instance, or None (default)]: Set random state to something other than None for reproducible results. | 10 |
| random_state | [int]: Set random state to something other than None for reproducible results. | 10 |

## References

5 changes: 3 additions & 2 deletions pkg/suggestion/v1alpha3/hyperopt/base_hyperopt_service.py
@@ -8,7 +8,8 @@
logger = logging.getLogger("BaseHyperoptService")

class BaseHyperoptService(object):
def __init__(self, algorithm_name="tpe"):
def __init__(self, algorithm_name="tpe", random_state=None):
self.random_state = random_state
if algorithm_name == 'tpe':
self.hyperopt_algorithm = hyperopt.tpe.suggest
elif algorithm_name == 'random':
@@ -42,7 +43,7 @@ def getSuggestions(self, search_space, trials, request_number):
hyperopt_search_space[param.name] = hyperopt.hp.choice(
param.name, param.list)
# New hyperopt variables
hyperopt_rstate = np.random.RandomState()
hyperopt_rstate = np.random.RandomState(self.random_state)
hyperopt_domain = hyperopt.Domain(
None, hyperopt_search_space, pass_expr_memo_ctrl=None)
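
The seeded `np.random.RandomState` above is what makes repeated runs reproducible. A minimal sketch outside Katib (variable names here are illustrative):

```python
import numpy as np

# Two generators seeded with the same random_state produce identical
# draw sequences, so points sampled from them match across runs.
seeded_a = np.random.RandomState(10)
seeded_b = np.random.RandomState(10)
draws_a = seeded_a.uniform(0.0, 1.0, size=3)
draws_b = seeded_b.uniform(0.0, 1.0, size=3)
print(np.array_equal(draws_a, draws_b))  # True
```

With `random_state=None` (the default), each run seeds from OS entropy and the draws differ.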

17 changes: 16 additions & 1 deletion pkg/suggestion/v1alpha3/hyperopt_service.py
@@ -17,12 +17,27 @@ def GetSuggestions(self, request, context):
"""
Main function to provide suggestion.
"""
name, config = OptimizerConfiguration.convertAlgorithmSpec(
request.experiment.spec.algorithm)
base_service = BaseHyperoptService(
algorithm_name=request.experiment.spec.algorithm.algorithm_name)
algorithm_name=name, random_state=config.random_state)
search_space = HyperParameterSearchSpace.convert(request.experiment)
trials = Trial.convert(request.trials)
new_assignments = base_service.getSuggestions(
search_space, trials, request.request_number)
return api_pb2.GetSuggestionsReply(
parameter_assignments=Assignment.generate(new_assignments)
)


class OptimizerConfiguration(object):
def __init__(self, random_state=None):
self.random_state = random_state

@staticmethod
def convertAlgorithmSpec(algorithm_spec):
optimizer = OptimizerConfiguration()
for s in algorithm_spec.algorithm_setting:
if s.name == "random_state":
optimizer.random_state = int(s.value)
return algorithm_spec.algorithm_name, optimizer
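
Taken out of the diff, the conversion helper can be exercised standalone. The `SimpleNamespace` objects below are stand-ins (an assumption for illustration) for the generated gRPC `AlgorithmSpec` message:

```python
from types import SimpleNamespace


class OptimizerConfiguration(object):
    def __init__(self, random_state=None):
        self.random_state = random_state

    @staticmethod
    def convertAlgorithmSpec(algorithm_spec):
        # Collect only the settings this optimizer understands.
        optimizer = OptimizerConfiguration()
        for s in algorithm_spec.algorithm_setting:
            if s.name == "random_state":
                optimizer.random_state = int(s.value)
        return algorithm_spec.algorithm_name, optimizer


# SimpleNamespace stands in for the AlgorithmSpec protobuf message.
spec = SimpleNamespace(
    algorithm_name="random",
    algorithm_setting=[SimpleNamespace(name="random_state", value="10")],
)
name, config = OptimizerConfiguration.convertAlgorithmSpec(spec)
print(name, config.random_state)  # random 10
```

Unknown setting names are silently ignored, so adding new settings later does not break older suggestion services.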
