Exporters automatic task detection #445
Conversation
optimum/exporters/tasks.py
Outdated
# TODO: implement this.
raise NotImplementedError("Cannot infer the task from a local directory yet.")
Not sure we even want to support this, in a first version at least.
Alright, changed the message and the exception to something that does not open the door for future support (even though it might come someday).
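A minimal sketch of what the described change might look like (hypothetical; the exact function name and message in the PR may differ). The point is that the new exception no longer implies local-directory support is merely pending:

```python
# Hypothetical sketch: raise RuntimeError with a message that does not
# suggest local-directory inference will be supported later, instead of
# the earlier NotImplementedError("... yet.").
def infer_task_from_model(model_name_or_path: str, is_local_dir: bool) -> str:
    if is_local_dir:
        raise RuntimeError(
            "Cannot infer the task from a local directory, please provide the task manually."
        )
    # ... task inference from the Hub would go here (elided) ...
    return "default"
```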
julien-c left a comment
Looks good from a quick glance!
optimum/exporters/tasks.py
Outdated
transformers_info = model_info.transformersInfo
if transformers_info is None or transformers_info.get("auto_model") is None:
    raise RuntimeError(f"Could not infer the task from the model repo {model_name_or_path}")
auto_model_class_name = f"{class_name_prefix}{transformers_info['auto_model']}"
I think auto_model can already start with TF in some cases, so you should only prepend the prefix if it's not already there.
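The suggestion above could be sketched as a small helper (hypothetical name, not the PR's actual code): prepend the framework prefix only when the auto class name reported by the Hub does not already carry it.

```python
# Sketch of the reviewer's suggestion: the Hub's transformersInfo may report
# either "AutoModelForSequenceClassification" or an already-prefixed name
# like "TFAutoModel", so the prefix must be added conditionally.
def build_auto_model_class_name(auto_model: str, prefix: str = "TF") -> str:
    if prefix and not auto_model.startswith(prefix):
        return f"{prefix}{auto_model}"
    return auto_model
```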
BTW context for anyone reading this PR: the Hub determines AutoModel-type from models' config.json using:
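For illustration only (this mirrors, rather than reproduces, the Hub's logic, and the helper name is invented): a model repo's config.json carries an "architectures" field, e.g. ["BertForSequenceClassification"], from which an AutoModel-type can be derived.

```python
import json

# Read the "architectures" list from a local copy of a model's config.json.
# Assumption: the field exists for most transformers model repos; an empty
# list is returned when it is absent.
def read_architectures(config_path: str) -> list:
    with open(config_path) as f:
        config = json.load(f)
    return config.get("architectures", [])
```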
This will pave the way to build a really cool "official" Space under the official ONNX org, WDYT @fxmarty? E.g. under https://huggingface.co/spaces/onnx/convert (for consistency with other converters)
LGTM, hopefully we put out a space tonight or tomorrow!
What does this PR do?
This PR enables automatic task detection for optimum.exporters.