This repository was archived by the owner on Sep 13, 2023. It is now read-only.
Follow-up for #664. Sometimes it's desired to store predictions along with the specific model version that returned those predictions. There are at least two ways to support that in MLEM:
Return the model version in the prediction response - the returned JSON would look like {"prediction": [0.4, 0.6], "version": "0.1.3"}. I've seen some generic ML frameworks do this, IIRC.
Return it in interface.json - the MLEM version is already there, so adding the model version seems logical
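To make option 1 concrete, here is a minimal sketch of what a version-bearing prediction response could look like. This is an illustration only: `MODEL_VERSION` and `predict_response` are hypothetical names, not part of MLEM's actual serving code.

```python
import json

# Assumed for illustration: the server knows its model's version,
# e.g. injected at build time or read from metadata.
MODEL_VERSION = "0.1.3"

def predict_response(probabilities):
    """Wrap raw model output together with the serving model's version."""
    return json.dumps({"prediction": probabilities, "version": MODEL_VERSION})

print(predict_response([0.4, 0.6]))
# {"prediction": [0.4, 0.6], "version": "0.1.3"}
```

Note that the version should be a JSON string, since a value like 0.1.3 is not a valid JSON number.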
Regarding how this info gets into the service, there are again two approaches:
Add it at mlem.api.save
Allow specifying it when building the server
The first seems more reasonable to me. Since this will require some under-the-hood integration with GTO, I'd do it after #664, which has the same decision to make.
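A rough sketch of how the save-time approach could work, independent of GTO integration details. The `version` argument and the sidecar metadata file are assumptions for illustration; they are not MLEM's actual API, where the version would presumably live inside the model's own metadata.

```python
import json
import os
import tempfile

def save_with_version(model_bytes: bytes, path: str, version: str) -> None:
    """Save a model artifact and record its version in sidecar metadata
    (hypothetical stand-in for recording the version at mlem.api.save time)."""
    with open(path, "wb") as f:
        f.write(model_bytes)
    with open(path + ".meta.json", "w") as f:
        json.dump({"model_version": version}, f)

def load_version(path: str) -> str:
    """A server built from this artifact can surface the recorded version."""
    with open(path + ".meta.json") as f:
        return json.load(f)["model_version"]

tmpdir = tempfile.mkdtemp()
path = os.path.join(tmpdir, "model.bin")
save_with_version(b"\x00weights", path, "0.1.3")
print(load_version(path))  # 0.1.3
```

The advantage of attaching the version at save time is that every server built from that artifact reports it consistently, with no extra build-time flag to forget.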