
Changing a Range Parameter To Fixed #383

Closed
kjanoudi opened this issue Sep 12, 2020 · 12 comments
Labels: bug (Something isn't working), fixready (Fix has landed on master.)

@kjanoudi

At some point during my experiment, some of my integer parameters that were originally Range params converge to a single value. At that point, instead of trimming the range to two values, I create a new search space, replace those parameters in the search space with a single FixedParameter, then set the search space on the experiment. However, I then begin receiving this error:

I'm curious why this happens. Here is an example:

My parameter "source" converged to 3 and was replaced by a FixedParameter.

The params in the function are:

    obsf.parameters: {'source': 2}
    self.fixed_parameters: {'source': FixedParameter(name='source', parameter_type=INT, value=3)}

The error is:

    Fixed parameter source with out of design value: 2 passed to RemoveFixed.

My only guess is that it is still generating values for the fixed parameters because they used to be RangeParameters, and the old experiment is being loaded from the database with them still typed as RangeParameters.
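
For reference, the replacement I'm describing looks roughly like this (a minimal sketch against Ax's core API; the experiment variable is assumed to be an existing Experiment):

    from ax.core.parameter import FixedParameter, ParameterType
    from ax.core.search_space import SearchSpace

    # Rebuild the search space, pinning the converged parameter "source" to 3.
    kept = [
        p for p in experiment.search_space.parameters.values()
        if p.name != "source"
    ]
    kept.append(
        FixedParameter(name="source", parameter_type=ParameterType.INT, value=3)
    )
    experiment.search_space = SearchSpace(parameters=kept)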

@2timesjay (Contributor)

At its root, this is a bug caused by overly strict validation when fixed parameters are present.

Your best option right now is not to change the search_space, but instead to use the fixed_features kwarg when calling gen.

In the Developer API, where you call gen (https://github.com/facebook/Ax/blob/master/ax/modelbridge/base.py#L576) to generate a new proposed arm (see https://ax.dev/tutorials/gpei_hartmann_developer.html#4.-Perform-Optimization), add code like:

    from ax.core.observation import ObservationFeatures

    # Pin "x0" at generation time without modifying the search space.
    fixed_features = ObservationFeatures(parameters={"x0": 1})
    batch = exp.new_trial(generator_run=gpei.gen(1, fixed_features=fixed_features))

Another workaround is to do what you're doing now, but narrow the search space instead of fixing the parameter outright. It is not ideal, but it may still improve performance.
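
A minimal sketch of that narrowing, assuming the integer parameter "source" from above has settled near 3 (the bounds are illustrative):

    from ax.core.parameter import ParameterType, RangeParameter
    from ax.core.search_space import SearchSpace

    # Keep "source" a RangeParameter, but tighten its bounds around the value
    # it has converged toward instead of fixing it outright.
    narrowed = RangeParameter(
        name="source", parameter_type=ParameterType.INT, lower=2, upper=4
    )
    others = [
        p for p in experiment.search_space.parameters.values()
        if p.name != "source"
    ]
    experiment.search_space = SearchSpace(parameters=others + [narrowed])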

@belotelov

Is fixing parameters supposed to work for Range parameters that haven't yet converged to a value? We started an experiment in multiple dimensions (all dimensions are continuous and the corresponding parameter types are floats) and would like to reduce the number of dimensions during the experiment by fixing some parameters to specific values. Using the fixed_features parameter of the gen function raises this error: ValueError: There are no feasible observed points.
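
The failing call looks roughly like this (a sketch; the model variable and parameter name are placeholders for our setup):

    from ax.core.observation import ObservationFeatures

    # Pinning a float parameter to a value no observed point matches exactly
    # raises "There are no feasible observed points." in this version.
    model.gen(n=1, fixed_features=ObservationFeatures(parameters={"x2": 0.7}))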

@ldworkin (Contributor)

@belotelov, is it possible for you to get us a reproducible example?

@Balandat (Contributor) commented Mar 2, 2021

Taking a step back, why would you

> like to reduce the number of dimensions during the experiment by fixing some parameters to specific values

Are there any issues with the optimization that you can observe? The model & optimization should be able to handle this.

@kjanoudi (Author) commented Mar 2, 2021

If I may contribute my use case: it seemed to me that using a Fixed parameter instead of a range sped the model up considerably when generating the next trial. I was able to hack in a solution, and it was definitely a lot faster when I swapped range for fixed after a range param had converged enough.

@tonyhammainen commented Mar 4, 2021

In #515 I asked whether something akin to this thread is doable with Ax: fixing range parameter values (or setting stricter bounds) later in the experiment, while still using all previous data points to learn from, regardless of whether they have that fixed value (or fall within the new stricter bounds), but having the generated trials conform to the restriction.

I can think of two reasons for this:

  1. It enables running what-if/sensitivity analyses, which act as a tool for transferring the model's learnings to practitioners (see the sketch after this list).

  2. The physical-world context of a black-box function can make particular parameter values inherently more convenient than others. Think package sizes, where partial units are possible to measure and evaluate, but whole units are easier to use. Whether the convenience trade-off makes sense then depends on how large the difference in the objective is.
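
A sketch of that what-if pattern via the fixed_features route suggested above (the gpei model and the parameter name "x0" are illustrative; the model is still fit to all observed data, only generation is restricted to the slice):

    from ax.core.observation import ObservationFeatures

    # Generate one candidate per slice to compare how the optimizer behaves
    # at different pinned values of "x0".
    for value in (0.25, 0.5, 0.75):
        generator_run = gpei.gen(
            n=1, fixed_features=ObservationFeatures(parameters={"x0": value})
        )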

@belotelov

> @belotelov, is it possible for you to get us a reproducible example?

@ldworkin Yep, here is a zip with the code and data files. There are two model.gen() calls: the first with all range parameters as they appear in the search space, and the second with a parameter fixed.

ax_fixparams.zip

> Taking a step back, why would you
>
> like to reduce the number of dimensions during the experiment by fixing some parameters to specific values
>
> Are there any issues with the optimization that you can observe? The model & optimization should be able to handle this.

@Balandat After running some trials and watching the parameters, we learned that it might take too many trials for all of them to converge (because of the nature of the experiment, each trial takes quite a long time to run). From previous experiments we have educated guesses for some parameter values, so we decided to fix those parameters and explore the rest of the parameter space.

2timesjay added the fixready (Fix has landed on master.) label on Mar 9, 2021
@2timesjay (Contributor)

@belotelov, @tonyhammainen

The solution using the fixed_features kwarg in gen (see #383 (comment)) should work once a recently landed fix to master is released. This should also give the desired behavior of fitting the model to the full data while only generating points in the selected slice.

The fix is here: f1d7759. It prevents overly aggressive filtering when new constraints are added mid-optimization.

An alternative before that release: run a trial manually with an arm that matches your fixed_features, then proceed with the fixed_features kwarg in gen.
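
A sketch of that pre-release workaround (parameter names and values are illustrative; exp is the Experiment from the tutorial above and is assumed to have a runner attached):

    from ax.core.arm import Arm

    # Attach and run one trial whose arm already sits in the intended slice,
    # so at least one feasible observed point exists before calling gen with
    # fixed_features.
    trial = exp.new_trial()
    trial.add_arm(Arm(parameters={"x0": 1.0, "x1": 0.5}))
    trial.run()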

We generally don't have good support for editing the types of parameters in a search space directly.

@belotelov

Thanks a lot, that patch solved the problem!

@XRen615 commented May 19, 2021

> The solution using the fixed_features kwarg in gen (see #383 (comment)) should work once a recently landed fix to master is released. […]

Thanks @2timesjay!
I tried with the latest version (0.1.20). It seems the fix only works with fixed_features, not with a narrowed search space, right?
Oh, I noticed it hasn't been shipped in the stable version yet; please ignore my question above.

@lena-kashtelyan (Contributor)

@XRen615, yep, the stable version is delayed a bit this time due to some pretty exciting updates we plan to publish with it :) It should be coming toward the end of June.

@lena-kashtelyan (Contributor)

This is now fixed and included in the latest stable release, 0.2.0.
