
pseudo genetic search #47

Closed
Borda opened this issue Aug 4, 2019 · 7 comments · May be fixed by #777
Labels
enhancement New feature or request

Comments


Borda commented Aug 4, 2019

Hello, I find this package very interesting, especially the way the parameters are integrated into model creation/design. So far Bayesian, Hyperband, and random search are implemented. Recently we found it interesting to use a kind of genetic search, where you mix the best parameters together and randomly mutate some of them. Would you be interested in having something like sklearn-genetic in this package? I could create a PR. Thanks
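The "mix the best parameters and randomly change some of them" idea could be sketched as follows. This is a minimal illustration of the proposed search, not Keras Tuner code; the function names and the list-valued search space are assumptions for the example.

```python
import random

def crossover(parent_a, parent_b):
    """Mix two hyperparameter dicts: each value comes from either parent."""
    return {k: random.choice([parent_a[k], parent_b[k]]) for k in parent_a}

def mutate(params, space, rate=0.2):
    """Randomly resample some values from the search space."""
    return {k: random.choice(space[k]) if random.random() < rate else v
            for k, v in params.items()}

def next_generation(scored, space, n_children=8, n_parents=4):
    """Select the best parents, then produce children by crossover + mutation.

    scored: list of (params, score) pairs, higher score is better.
    """
    ranked = sorted(scored, key=lambda t: t[1], reverse=True)
    parents = [p for p, _ in ranked[:n_parents]]
    children = []
    for _ in range(n_children):
        a, b = random.sample(parents, 2)
        children.append(mutate(crossover(a, b), space))
    return children
```

Each generation, the trained models are scored, the top configurations are kept as parents, and the next batch of trials is bred from them.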


ghost commented Aug 5, 2019

Hi, I am also a fan of Keras Tuner and its huge potential in addressing practical business problems by removing some of the human imperfections inherent to hyperparameter tuning.
I agree, it would be a great idea to add genetic search alongside the random search and Hyperband algorithms currently available, along with guidelines on when to use one instead of the other (a selection guide of some kind).


Borda commented Aug 15, 2019

@ebursztein @jamlong @fchollet ^^

@omalleyt12
Contributor

@Borda @algit123 That's great to hear! Thanks for your interest in implementing a genetic search algorithm. The internals of the Oracle class are still subject to change ahead of the 1.0 release, mainly because we are adding support for distributed tuning, so you'd probably want to wait before implementing a new algorithm (likely just 3-4 more weeks until API stability).

But yes, our intention is to make it easy to subclass the Oracle class to add new algorithms. I'll update this thread when the API is stable enough that you wouldn't have to rewrite the Oracle.

If you'd like to see genetic search included in this repo, probably the best process is to subclass the Oracle in your own repo and try it out on some models, to show the circumstances under which it can converge faster than other algorithms for a class of neural networks.
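An oracle for genetic search, as suggested above, would essentially propose trial configurations and fold reported scores back into its population. The sketch below illustrates that shape in plain Python; the class and method names (`GeneticSearch`, `populate_space`, `update_trial`) are hypothetical, loosely modeled on the Oracle interface described in this thread, and the real keras-tuner API may differ.

```python
import random

class GeneticSearch:
    """Sketch of a genetic-search oracle: proposes trials, records results."""

    def __init__(self, space, population_size=6, mutation_rate=0.2):
        self.space = space              # name -> list of candidate values
        self.population_size = population_size
        self.mutation_rate = mutation_rate
        self.trials = {}                # trial_id -> (values, score or None)

    def populate_space(self, trial_id):
        """Propose hyperparameter values for a new trial."""
        scored = [(v, s) for v, s in self.trials.values() if s is not None]
        if len(scored) < 2:             # not enough history: sample randomly
            values = {k: random.choice(c) for k, c in self.space.items()}
        else:                           # breed from the current best trials
            best = sorted(scored, key=lambda t: t[1],
                          reverse=True)[:self.population_size]
            a, b = random.sample([v for v, _ in best], 2)
            values = {k: random.choice([a[k], b[k]]) for k in self.space}
            for k, cand in self.space.items():
                if random.random() < self.mutation_rate:
                    values[k] = random.choice(cand)
        self.trials[trial_id] = (values, None)
        return values

    def update_trial(self, trial_id, score):
        """Record the objective value reported for a finished trial."""
        values, _ = self.trials[trial_id]
        self.trials[trial_id] = (values, score)
```

A benchmark repo, as requested, would drive this loop against real models and compare convergence against random search and Hyperband.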

@omalleyt12
Contributor

The Oracle class is now stable enough to implement subclasses. Marking as contributions welcome. Any PR with an implementation should also provide a link to a repo with reproducible benchmarks.


lsgrep commented Nov 23, 2019

This is really interesting, and I would like to take a stab at it.


vb690 commented Oct 29, 2020

Very interested in this! It would be awesome to have an implementation of something like MAP-Elites.
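For reference, MAP-Elites differs from plain genetic search in that it keeps one elite per cell of a discretized behavior space rather than a single best population. A toy sketch of the core loop, with hypothetical function names chosen for this example:

```python
import random

def map_elites(evaluate, sample, mutate, n_iter=1000, n_init=20):
    """Toy MAP-Elites loop: keep the best solution (elite) per behavior cell.

    evaluate(x) -> (fitness, cell): a fitness to maximize, plus the discrete
    behavior descriptor (archive cell) the solution falls into.
    """
    archive = {}  # cell -> (fitness, solution)
    for _ in range(n_init):                  # seed the archive randomly
        x = sample()
        f, cell = evaluate(x)
        if cell not in archive or f > archive[cell][0]:
            archive[cell] = (f, x)
    for _ in range(n_iter):                  # mutate randomly chosen elites
        _, parent = random.choice(list(archive.values()))
        child = mutate(parent)
        f, cell = evaluate(child)
        if cell not in archive or f > archive[cell][0]:
            archive[cell] = (f, child)
    return archive
```

For hyperparameter tuning, the behavior descriptor could be something like model size or inference latency, so the archive yields good configurations across a range of trade-offs instead of one winner.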

Anselmoo added a commit to Anselmoo/keras-tuner that referenced this issue Dec 5, 2022
The proposed implementation of a genetic algorithm for hyperparameter optimization.

Even if genetic optimization might be costly for CNNs, the applications in numerical analysis or Design of Experiments (DoE) still make it interesting.

Fixes: keras-team#47


Ref:

1. [Vishwakarma G, et al Towards Autonomous Machine Learning in Chemistry via Evolutionary Algorithms. **ChemRxiv.**](https://chemrxiv.org/engage/api-gateway/chemrxiv/assets/orp/resource/item/60c7445a337d6c2849e26d98/original/towards-autonomous-machine-learning-in-chemistry-via-evolutionary-algorithms.pdf)
2. [Rosanna Nichols et al 2019 _Quantum Sci. Technol._ **4** 045012](https://iopscience.iop.org/article/10.1088/2058-9565/ab4d89/meta?casa_token=db7uZRqRMEAAAAAA:fRO9qB25dAkeoskS6MMyzpZw2jSiMkpsN4zA_k6lheWUXaSUU8fPS-JPMoNFcIl9tka4OPCG5AtDtiM)
@haifeng-jin
Collaborator

I am closing the issue, as we are not actively accepting new tuning algorithms unless they are proven to have a performance gain across a number of use cases.
