
GUI Training: SLEAP does not use non-GUI parameters when using training config as template #1052

Closed
roomrys opened this issue Nov 18, 2022 · 2 comments
Labels
bug Something isn't working

Comments


roomrys commented Nov 18, 2022

Bug description

Loading a training config in the GUI does not use the loaded training config's hidden parameters, i.e. parameters not displayed in the GUI.

Expected behaviour

The loaded training config's parameters should be used as the defaults for any parameter that is not exposed in the GUI.

Actual behaviour

Parameters set in the GUI are used, but parameters not exposed in the GUI fall back to the defaults defined in code by the TrainingJobConfig class, so the loaded config's values for those parameters are silently ignored.
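A minimal sketch of the failure mode, using stdlib dataclasses as hypothetical stand-ins for SLEAP's attrs-based config classes (the field names here are illustrative, not SLEAP's actual schema): structuring only the GUI-exposed keys fills every other field with the class default, not with the value from the loaded template config.

```python
from dataclasses import dataclass, field


# Hypothetical stand-ins for SLEAP's attrs-based config classes.
@dataclass
class OutputsConfig:
    save_visualizations: bool = False  # default defined in code


@dataclass
class TrainingJobConfig:
    batch_size: int = 4  # exposed in the GUI
    outputs: OutputsConfig = field(default_factory=OutputsConfig)


# Suppose the loaded template config set save_visualizations=True,
# but the GUI form only carries the keys it displays:
gui_form_data = {"batch_size": 8}

cfg = TrainingJobConfig(**gui_form_data)
print(cfg.batch_size)                   # 8 (from the GUI)
print(cfg.outputs.save_visualizations)  # False (code default; template value lost)
```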

How to reproduce

  1. Go to the Training Pipeline dialog in the GUI.
  2. Click on the Model Configuration tab.
  3. Select a model from the drop-down.
  4. Run.
@roomrys roomrys added the bug Something isn't working label Nov 18, 2022

roomrys commented Nov 18, 2022

Relevant Code

  • run calls get_every_head_config_data

    def run(self):
        """Run with current dialog settings."""
        pipeline_form_data = self.pipeline_form_widget.get_form_data()
        items_for_inference = self.get_items_for_inference(pipeline_form_data)
        config_info_list = self.get_every_head_config_data(pipeline_form_data)

        # Close the dialog now that we have the data from it
        self.accept()

        # Run training/learning pipeline using the TrainingJobs
        new_counts = runners.run_learning_pipeline(
            labels_filename=self.labels_filename,
            labels=self.labels,
            config_info_list=config_info_list,
            inference_params=pipeline_form_data,
            items_for_inference=items_for_inference,
        )

        self._handle_learning_finished.emit(new_counts)

        # count < 0 means there was an error and we didn't get any results.
        if new_counts is not None and new_counts >= 0:
            total_count = items_for_inference.total_frame_count
            no_result_count = total_count - new_counts

            message = (
                f"Inference ran on {total_count} frames."
                f"\n\nInstances were predicted on {new_counts} frames "
                f"({no_result_count} frame{'s' if no_result_count != 1 else ''} with "
                "no instances found)."
            )

            win = QtWidgets.QMessageBox(text=message)
            win.setWindowTitle("Inference Results")
            win.exec_()

  • get_every_head_config_data calls make_training_config_from_key_val_dict

    def get_every_head_config_data(
        self, pipeline_form_data
    ) -> List[configs.ConfigFileInfo]:
        cfg_info_list = []

        # Copy relevant data into linked fields (i.e., anchor part).
        self.adjust_data_to_update_other_tabs(pipeline_form_data)

        for tab_name in self.shown_tab_names:
            trained_cfg_info = self.tabs[tab_name].trained_config_info_to_use
            if trained_cfg_info:
                trained_cfg_info.dont_retrain = trained_cfg_info
                cfg_info_list.append(trained_cfg_info)
            else:
                tab_cfg_key_val_dict = self.tabs[tab_name].get_all_form_data()
                self.merge_pipeline_and_head_config_data(
                    head_name=tab_name,
                    head_data=tab_cfg_key_val_dict,
                    pipeline_data=pipeline_form_data,
                )
                cfg = scopedkeydict.make_training_config_from_key_val_dict(
                    tab_cfg_key_val_dict
                )
                cfg_info = configs.ConfigFileInfo(config=cfg, head_name=tab_name)
                cfg_info_list.append(cfg_info)

        return cfg_info_list

  • make_training_config_from_key_val_dict creates TrainingJobConfig

    def make_training_config_from_key_val_dict(key_val_dict: dict) -> TrainingJobConfig:
        """Make :py:class:`TrainingJobConfig` object from flat dictionary.

        Arguments:
            key_val_dict: Flat dictionary from :py:class:`TrainingEditorWidget`.

        Returns:
            The :py:class:`TrainingJobConfig` object.
        """
        apply_cfg_transforms_to_key_val_dict(key_val_dict)
        cfg_dict = ScopedKeyDict(key_val_dict).to_hierarchical_dict()
        cfg = cattr.structure(cfg_dict, TrainingJobConfig)
        return cfg
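A sketch of the flat-to-hierarchical conversion that `ScopedKeyDict.to_hierarchical_dict` is assumed to perform (this is an illustrative stdlib reimplementation, not SLEAP's actual code): scoped keys like `"optimization.batch_size"` are split on `"."` and nested, and any scoped key absent from the flat dict is simply absent from the result, which is why `cattr.structure` then falls back to the class defaults for it.

```python
def to_hierarchical_dict(flat: dict) -> dict:
    """Nest scoped keys like "a.b.c" into {"a": {"b": {"c": ...}}}."""
    out: dict = {}
    for scoped_key, value in flat.items():
        node = out
        *scopes, leaf = scoped_key.split(".")
        for scope in scopes:
            # Walk/create intermediate dicts for each scope level.
            node = node.setdefault(scope, {})
        node[leaf] = value
    return out


flat = {
    "optimization.batch_size": 8,
    "model.backbone.unet.filters": 16,
}
print(to_hierarchical_dict(flat))
# {'optimization': {'batch_size': 8}, 'model': {'backbone': {'unet': {'filters': 16}}}}
```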

  • TrainingJobConfig has defaults for values not specified in the GUI (e.g., outputs = OutputsConfig)

    class TrainingJobConfig:
        """Configuration of a training job.

        Attributes:
            data: Configuration options related to the training data.
            model: Configuration options related to the model architecture.
            optimization: Configuration options related to the training.
            outputs: Configuration options related to outputs during training.
            name: Optional name for this configuration profile.
            description: Optional description of the configuration.
            sleap_version: Version of SLEAP that generated this configuration.
            filename: Path to this config file if it was loaded from disk.
        """

        data: DataConfig = attr.ib(factory=DataConfig)
        model: ModelConfig = attr.ib(factory=ModelConfig)
        optimization: OptimizationConfig = attr.ib(factory=OptimizationConfig)
        outputs: OutputsConfig = attr.ib(factory=OutputsConfig)
        name: Optional[Text] = ""
        description: Optional[Text] = ""
        sleap_version: Optional[Text] = sleap.__version__
        filename: Optional[Text] = ""

@roomrys roomrys added open pr A fix has been written, but is still being reviewed. fixed in future release Fix or feature is merged into develop and will be available in future release. and removed open pr A fix has been written, but is still being reviewed. labels Nov 23, 2022

roomrys commented Feb 24, 2023

This fix is now available in the pre-release 1.3.0a0. To install, first uninstall your current SLEAP environment, then:
conda (Windows/Linux/GPU):

conda create -y -n sleap -c sleap -c sleap/label/dev -c nvidia -c conda-forge sleap=1.3.0a0

pip (any OS except Apple Silicon):

pip install sleap==1.3.0a0

Warning: This is a pre-release! Expect bugs and strange behavior when testing.

@roomrys roomrys closed this as completed Feb 24, 2023
@roomrys roomrys removed the "fixed in future release" (Fix or feature is merged into develop and will be available in a future release.) label Feb 24, 2023