Reinstate integration tests for SymbolicRegression? #1152

Open
ablaom opened this issue Jan 5, 2025 · 5 comments

Comments

@ablaom
Member

ablaom commented Jan 5, 2025

The issue MilesCranmer/SymbolicRegression.jl#390 is now resolved. However, the models are extremely slow to train relative to others: integration tests on tiny data sets take several minutes (> 10 min), apparently because the default hyper-parameters are inappropriate for small data.

I propose removing SymbolicRegression from the tests altogether, as it is not really needed for testing integration. We have plenty of other models of the same generic type.

@MilesCranmer

@ablaom ablaom changed the title Integration tests for SymbolicRegression are very slow. Reinstate integration tests for SymbolicRegression? Jan 5, 2025
@MilesCranmer

If you just need a quick test, you could set niterations=1? The default hyperparameters are much beefier.
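[Editor's note: a minimal sketch of what that suggestion would look like through the MLJ machine interface. This assumes SRRegressor, the MLJ-facing model exported by SymbolicRegression.jl; the toy data is purely illustrative.]

```julia
using MLJ, SymbolicRegression

# Override only niterations to keep the search short enough for CI;
# all other hyper-parameters keep their (beefier) defaults.
model = SRRegressor(niterations=1)

# Tiny illustrative table (any Tables.jl-compatible source works with MLJ):
X = (; x1 = rand(30), x2 = rand(30))
y = @. 2 * X.x1 - X.x2

mach = machine(model, X, y)
fit!(mach, verbosity=0)
yhat = predict(mach, X)
```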

@ablaom
Member Author

ablaom commented Jan 8, 2025

Thanks for the suggestion, but integration tests only test default values.

@MilesCranmer

I’m not sure I understand the issue here. Do you mean that the defaults need to be under a certain compute budget as part of the formal interface? But different algorithms have different costs, by their very nature. Maybe there could be a MLJModelInterface.test_defaults(::Regressor) trait to recommend defaults for unit test purposes?
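[Editor's note: a purely hypothetical sketch of what such a trait could look like. Nothing here exists in MLJModelInterface today; test_defaults and instantiate_for_test are invented names for illustration only.]

```julia
# Hypothetical trait: a model package declares cheap hyper-parameters for
# unit-test purposes, and the test harness splats them over the defaults.

# Fallback, to be owned by MLJModelInterface (no overrides by default):
test_defaults(::Type) = NamedTuple()

# A model package could then opt in, e.g.:
# test_defaults(::Type{<:SRRegressor}) = (niterations = 1,)

# The integration-test harness would construct models accordingly:
instantiate_for_test(M::Type) = M(; test_defaults(M)...)
```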

@MilesCranmer

MilesCranmer commented Feb 3, 2025

Hi @ablaom,

I have created SymbolicRegression.SRTestRegressor and SymbolicRegression.MultitargetSRTestRegressor which have lightweight defaults. Let me know if that helps.
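[Editor's note: presumably these test models follow the standard MLJ workflow, like the ordinary SRRegressor. A hedged sketch, with illustrative toy data; consult the SymbolicRegression.jl docs for the actual lightweight defaults.]

```julia
using MLJ, SymbolicRegression

# Lightweight-default variant intended for fast CI runs:
model = SymbolicRegression.SRTestRegressor()

X = (; x = rand(20))
y = 3 .* X.x

mach = machine(model, X, y)
fit!(mach, verbosity=0)
yhat = predict(mach, X)
```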

Best,
Miles

@ablaom
Member Author

ablaom commented Feb 11, 2025

Thanks for this. It might help, but, as mentioned earlier, I think we already have sufficient models for testing MLJ integration. If you want, you could add the integration tests yourself locally, as I sketched out earlier.

@Moef At present, integration tests don't catch warnings like the ones you encountered before, although I suppose this could be added.

The way registration works is that all models in the package get added: you register a package, not individual models. So these new models will be discoverable by the general MLJ user. I can mask out the integration tests for the "normal" models, as at present, but "hiding" the test models from the user would require a hack, which I am reluctant to add. So only add these if you are happy for the general user to see them.

Incidentally, if you add them, integration tests will be automatically included in the next MLJModels/MLJ update cycle.
