Description
I noticed a problem with ordinal hyperparameters.
I do not know whether this is a bug or a feature, but they do not behave like
integer hyperparameters.
Here is an example of what happens:
Define the configuration space in a .pcs file:
batch_size integer [8, 128] [64] log
sequence_size ordinal {128, 256, 512, 1024, 2048, 4096} [512]
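For reference, the loading step (elided as "Some loading code" below) could look roughly like this, assuming the definitions above are saved as space.pcs and read with ConfigSpace's pcs_new reader:
from ConfigSpace.read_and_write import pcs_new

# Read the parameter definitions from the .pcs file
# (the file name "space.pcs" is an assumption here).
with open("space.pcs") as fh:
    config_space = pcs_new.read(fh)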
Load this ConfigSpace, sample a random configuration, and convert it to a vector:
config_space = ... # Some loading code
sample_vector = config_space.sample_configuration().get_array()
#Corresponding Array:
#[ 0.40941199 2. ]
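For reference, the array positions can be mapped back to hyperparameter names with ConfigSpace's accessors (a small sketch):
# Map each vector position back to its hyperparameter
for name in config_space.get_hyperparameter_names():
    idx = config_space.get_idx_by_hyperparameter_name(name)
    print(idx, name, sample_vector[idx])
# batch_size shows up as a normalized float in [0, 1], while
# sequence_size is the index of the chosen value in its ordered sequence.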
Now convert back to the Configuration:
sample = Configuration(config_space, vector=sample_vector)
Works nicely:
batch_size, Value: 24
sequence_size, Value: '512'
Add some noise to the vector (BOHB does this when sampling from the model;
it treats ordinal, integer, and continuous parameters in the same way):
sample_vector[0] += 0.1
sample_vector[1] += 0.1
#Corresponding Array:
#[ 0.50941199 2.1 ]
Now convert back to the Configuration:
sample = Configuration(config_space, vector=sample_vector)
For some reason sequence_size disappears.
Configuration:
batch_size, Value: 32
Is that intended? Shouldn't an ordinal hyperparameter have the same representation as an integer hyperparameter when transformed to an array? Right now its representation is similar to that of a categorical hyperparameter.
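A possible workaround is to snap the ordinal dimension back onto a valid index before rebuilding the Configuration (just a sketch, using the space and vector from above):
import numpy as np

# Snap the ordinal dimension back onto a valid index before rebuilding
# the Configuration. sequence_size has 6 ordered values, so valid
# vector entries are the indices 0..5.
ordinal_hp = config_space.get_hyperparameter("sequence_size")
idx = config_space.get_idx_by_hyperparameter_name("sequence_size")
num_values = len(ordinal_hp.sequence)
sample_vector[idx] = np.clip(np.round(sample_vector[idx]), 0, num_values - 1)
sample = Configuration(config_space, vector=sample_vector)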