
feat!: use gradient method for the optimisation part #31

Merged: 50 commits merged into main from the gradient branch on Sep 19, 2023

Conversation

@leo-desbureaux-tellae (Contributor) commented on Sep 17, 2023

We replace the optimisation algorithm (the computation of the crossed-modalities probabilities) with a self-implemented one, in order to get rid of the maxentropy package, which is no longer maintained. Enrichment classes are given more consistent names. A minimal sketch of the general approach is given after this description.
The global quality of the resulting population is improved by this new implementation (better fit to the marginal distributions).

Former modules, classes and tests are kept in this commit to provide a comparison environment if needed. They will be removed later.

CAUTION: these changes introduced a problem where the test_bhepop2_enrich test passes locally but not on GitHub. See #34

BREAKING CHANGE:

  • the change in the optimisation algorithm means that the results will change, breaking reproducibility
  • the main enrichment class is renamed to Bhepop2Enrichment

Closes #28
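For illustration, here is a minimal, self-contained sketch of what "gradient method for the optimisation part" can look like: fitting maximum-entropy crossed-modalities probabilities to marginal frequencies by gradient descent on the dual problem. This is not the bhepop2 implementation; the function name, matrix `A`, `target_marginals` and the toy data are illustrative assumptions.

```python
import numpy as np

def fit_max_entropy(A, target_marginals, lr=0.1, n_iter=20000, tol=1e-10):
    """Fit a maximum-entropy distribution over discrete states so that its
    marginals A @ p match target_marginals, via gradient descent on the dual.

    A: (n_constraints, n_states) 0/1 indicator matrix (one row per marginal).
    target_marginals: observed frequency for each constraint.
    """
    lam = np.zeros(A.shape[0])            # dual variables, one per constraint
    for _ in range(n_iter):
        logits = lam @ A                  # unnormalised log-probabilities
        logits -= logits.max()            # numerical stability
        p = np.exp(logits)
        p /= p.sum()                      # current candidate distribution
        grad = A @ p - target_marginals   # model marginals minus targets
        if np.abs(grad).max() < tol:
            break
        lam -= lr * grad                  # plain gradient step on the dual
    return p

# Toy example: 4 crossed modalities of two binary attributes, constrained to
# match one marginal per attribute.
A = np.array([[1, 1, 0, 0],   # states where attribute 1 takes its first modality
              [1, 0, 1, 0]])  # states where attribute 2 takes its first modality
targets = np.array([0.6, 0.3])
p = fit_max_entropy(A, targets)
print(p)       # ~[0.18, 0.42, 0.12, 0.28]
print(A @ p)   # ~[0.6, 0.3], i.e. the marginal constraints are reproduced
```

The dual gradient is simply the difference between the model marginals and the observed targets, which is what makes a plain gradient scheme attractive here compared to depending on the unmaintained maxentropy package.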

PierreOlivierVandanjon and others added 30 commits April 19, 2023 15:55
…more advanced than Nelder Mead; demo.py is the Python version of demo.ipynb, to which the class MaxEntropyEnrichment_gradient was added:
…radient, then commented out because it did not change much
# Conflicts:
#	bhepop2/gradient_enrich.py
#	examples/demo.ipynb
@leo-desbureaux-tellae leo-desbureaux-tellae changed the title feat: use gradient method for the optimisation part feat!: use gradient method for the optimisation part Sep 18, 2023
@leo-desbureaux-tellae leo-desbureaux-tellae merged commit 518deef into main Sep 19, 2023
1 check passed
@leo-desbureaux-tellae leo-desbureaux-tellae deleted the gradient branch September 19, 2023 10:47
Successfully merging this pull request may close these issues.

Mise au propre du gradient (cleaning up the gradient implementation)
2 participants