
It seems that the time complexity is O(2^S), where S is num_to_sample #464

Open

hzhangxyz opened this issue Jul 27, 2016 · 1 comment

@hzhangxyz
Because of this, it is hard to parallelize the computation. Is there a solution?

Also, could you mention this in the documentation? It currently only says "O(N^3), where N is the number of historical points".

@wujian16

I don't see where the O(2^S) comes from; please refer to the paper "Parallel Bayesian Global Optimization of Expensive Functions" for details.

In the meantime, you should try running the code to see how fast it can be for up to 500 evaluations (and num_to_sample up to 20).
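
For context, the paper cited above estimates the multi-point expected improvement (q-EI) by Monte Carlo rather than by any exact formula that enumerates outcomes. A minimal, self-contained sketch of that estimator is below, assuming you already have the GP posterior mean `mu` and covariance `cov` at the `q = num_to_sample` candidate points; all names here are illustrative and are not MOE's actual API. The point is that the cost is polynomial in q (one Cholesky factorization plus O(num_mc · q^2) for the samples), not exponential:

```python
import numpy as np

def monte_carlo_q_ei(mu, cov, best_so_far, num_mc=10000, seed=0):
    """Monte Carlo estimate of q-EI for a batch of q points (minimization).

    mu:          (q,) GP posterior mean at the q candidate points
    cov:         (q, q) GP posterior covariance at the q candidate points
    best_so_far: best objective value observed so far

    Cost: O(q^3) for the Cholesky factorization plus O(num_mc * q^2)
    for the correlated Gaussian draws -- polynomial in q.
    """
    rng = np.random.default_rng(seed)
    q = len(mu)
    # Jitter for numerical stability of the factorization (illustrative choice).
    chol = np.linalg.cholesky(cov + 1e-10 * np.eye(q))
    # Draw num_mc joint samples from N(mu, cov).
    z = rng.standard_normal((num_mc, q))
    samples = mu + z @ chol.T
    # Improvement of the best point in each sampled batch over best_so_far.
    improvement = np.maximum(best_so_far - samples.min(axis=1), 0.0)
    return improvement.mean()
```

Each Monte Carlo draw only takes the minimum over the q sampled values, so nothing in the estimate grows like 2^S; the draws are also independent, which is what makes the computation easy to parallelize in practice.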
