Changing a Range Parameter To Fixed #383
At its root this is a bug due to overly strict validation when we have fixed parameters. Your best option right now is not to change the search_space, but instead to use the fixed_features kwarg when calling gen. In the developer API, where you call gen (https://github.com/facebook/Ax/blob/master/ax/modelbridge/base.py#L576) to generate a new proposed arm (see https://ax.dev/tutorials/gpei_hartmann_developer.html#4.-Perform-Optimization), add code like:
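The original snippet that followed "add code like" is missing from this capture. A minimal sketch of the suggested call, assuming a fitted ModelBridge named `model` as in the developer API tutorial (the parameter name "x2" and the value 0.5 are illustrative, not from this issue): you pass an `ObservationFeatures` carrying the pinned values to `gen`, and the model optimizes only the free parameters. The runnable part below mimics the semantics of that override, since the real call needs a fitted model:

```python
# Real usage (illustrative names; assumes a fitted ModelBridge `model`):
#
#   from ax.core.observation import ObservationFeatures
#   generator_run = model.gen(
#       n=1, fixed_features=ObservationFeatures(parameters={"x2": 0.5})
#   )
#
# Semantically, fixed_features pins a slice of the search space: the pinned
# values override whatever the model would otherwise propose for those
# parameters. The pure-Python helper below mimics that override step.

def pin_parameters(proposed, fixed_features):
    """Return the candidate with pinned values overriding proposed ones."""
    candidate = dict(proposed)
    candidate.update(fixed_features)
    return candidate

print(pin_parameters({"x1": 0.12, "x2": 0.93}, {"x2": 0.5}))
```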
Another workaround is to do what you're doing but narrow the search space instead of fixing it. It is not ideal but may still improve performance.
Is fixing parameters supposed to work when fixing Range parameters that hadn't yet converged to some value? We started an experiment in multiple dimensions (all dimensions are continuous and the corresponding parameter types are floats) and would like to reduce the number of dimensions during the experiment by fixing some parameters to specific values. Using the fixed_features parameter of the gen function raises this error:
@belotelov, is it possible for you to get us a reproducible example?
Taking a step back, why would you want to fix these parameters mid-experiment? Are there any issues with the optimization that you can observe? The model & optimization should be able to handle this.
If I may contribute my use case: it seemed to me that using a Fixed parameter instead of a Range sped the model up considerably when generating the next trial. I was able to hack in a solution, and it was definitely a lot faster when I swapped Range for Fixed after a range parameter had converged enough.
In #515 I was asking if something akin to this thread is doable with Ax: fixing range parameter values (or updating to stricter bounds) later in the experimentation, while still using all previous data points to learn from regardless of whether they have that fixed value (or fall within the new stricter bounds), but having the generated trials stay within this spec. I can think of two reasons for this,
@ldworkin Yep, here is a zip with code and data files. There are two model.gen() calls: the first with all range parameters as they are present in the search space, and the second with some parameter fixed.
@Balandat After running some trials and watching the parameters, we learned that it might take too many trials for all of them to converge (because of the nature of the experiment, it takes quite a long time to run a trial). From previous experiments we have some educated guesses about some parameter values, so we decided to fix those parameters and explore the rest of the parameter space.
The solution using fixed_features in gen (see #383 (comment)) should work once a recently landed fix in master is released. This should also give the desired behavior of fitting the model to the full data but only generating points in the selected slice. The fix is here: f1d7759. It prevents overly aggressive filtering when new constraints are added mid-optimization. An alternative before that release: run a trial manually with an arm that matches your fixed_features, then proceed with the fixed_features kwarg in gen. We generally don't have good support for editing the types of parameters in a search space directly.
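The "run a trial manually" workaround above can be sketched as follows, assuming an Ax `experiment` object (parameter names and values are illustrative). The runnable helper just checks the one invariant the workaround relies on: the manually added arm must lie in the slice defined by your fixed_features.

```python
# Real usage (illustrative names; assumes an Ax `experiment`):
#
#   from ax.core.arm import Arm
#   trial = experiment.new_trial()
#   trial.add_arm(Arm(parameters={"x1": 0.1, "x2": 0.5}))  # x2 matches fixed_features
#
def arm_in_fixed_slice(arm_parameters, fixed_features):
    """True iff the arm agrees with every pinned parameter value."""
    return all(
        arm_parameters.get(name) == value
        for name, value in fixed_features.items()
    )

print(arm_in_fixed_slice({"x1": 0.1, "x2": 0.5}, {"x2": 0.5}))
```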
Thanks a lot, that patch solved the problem!
thanks @2timesjay! |
@XRen615, yep the stable version is delayed a bit this time due to some pretty exciting updates we plan to publish with it : ) It should be coming towards the end of June. |
This is now fixed and included in the latest stable release, 0.2.0.
At some point during my experiment, some of my integer parameters that were originally Range params converge to a single value. At that point, instead of trimming the range to two values, I create a new search space, replace those parameters in the search space with a single FixedParameter, and then set the search space on the experiment. However, I then begin receiving this error:
Ax/ax/modelbridge/transforms/remove_fixed.py, line 46 (at 1febe33)
I'm curious why this happens. Here is an example:
My parameter "source" converged to 3 and was replaced by a FixedParameter.
The params in the function are:
The error is:
Fixed parameter source with out of design value: 2 passed to RemoveFixed

My only guess is that it is still generating values for the fixed parameters because they used to be RangeParameters, and the old experiment is being loaded from the database with them as RangeParameters.
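The error above is consistent with the RemoveFixed transform checking each observation's value for a fixed parameter against the FixedParameter's pinned value: trials generated while "source" was still a Range parameter (e.g. source=2) are out of design once it is pinned at 3. A rough mimic of that check, assuming only what the error message itself shows (the real logic lives in ax/modelbridge/transforms/remove_fixed.py):

```python
def check_fixed_value(name, observed, fixed_value):
    """Rough mimic of the RemoveFixed in-design check: an observed value
    that differs from the FixedParameter's pinned value is rejected."""
    if observed != fixed_value:
        raise ValueError(
            f"Fixed parameter {name} with out of design value: "
            f"{observed} passed to RemoveFixed"
        )

check_fixed_value("source", 3, 3)      # trial already at the pinned value: OK
try:
    check_fixed_value("source", 2, 3)  # trial from when "source" was a Range param
except ValueError as e:
    print(e)
```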