enable HFDetector model configuration with hf_args #810
Conversation
Signed-off-by: Jeffrey Martin <[email protected]>
I anticipate
Thanks for this, working great. Q: this ofc works (the class-scoped form, sketched below):
Configuring the base class as below didn't work; I guess that's predicated on how config matching is applied, probably class-name matching is more conservative/tidy than matching against base classes. Is there a way to configure groups of plugins?
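(Neither yaml snippet from this comment is preserved above. As a rough sketch, the working class-scoped form follows this general shape, assuming garak's `plugins.detectors.<module>.<ClassName>` nesting; the module and class names here are placeholders, not the exact yaml from the thread:)

```yaml
---
plugins:
  detectors:
    some_module:            # detector module name (placeholder)
      SomeHFDetectorClass:  # concrete detector class (placeholder)
        hf_args:
          device: cuda
```

A similar block keyed by the base class name rather than a concrete module/class is the form that did not appear to be matched.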
Side issue - moving HFDetector to GPU exposes a further possible optimisation through using an HF dataset. This message comes up, presumably from the underlying pipeline.
The optimisation is out of scope here; tracking in #812
Yes, although it might be too broad for some detectors. We can configure at the module level; currently there is no support for configuring all implemented classes of a base plugin. Module-level config would provide something like the following:
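(The exact yaml shared in the thread isn't preserved; this is a minimal sketch of a module-scoped configuration under the same assumed `plugins.detectors.<module>` nesting, applying `hf_args` to every detector class in the `misleading` module used as an example here:)

```yaml
---
plugins:
  detectors:
    misleading:       # module-level scope: applies to all detectors in this module
      hf_args:
        device: cuda
```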
While I like the idea of requiring less configuration, I think in practical usage explicit configuration for a base class would be overkill and would require more internal knowledge of the implementation structure than is typical for most users. I am skeptical of the trade-off in value for power users vs. the complexity, maintenance, and support of configuration by instance type.
Yeah, if doing this for base classes requires extra code, I'm having a hard time justifying it too. Thanks for the yaml, this works fine.
Fix #803

Allows `hf_args` configuration for simple `HFDetectors`:

* `cpu` device when no configuration is provided for the detector `from_pretrained`
* `MustContradictNLI.detect()` places the prompt for evaluation in the configured `device` space
* `detector_model_path` and `detector_target_class`

Example config of detector using `cuda`: `detector_cuda.yaml`
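(The contents of `detector_cuda.yaml` aren't preserved above. As a sketch, a config exercising the `misleading.MustContradictNLI` detector on `cuda` could look like the following, again assuming the `plugins.detectors.<module>.<ClassName>` nesting:)

```yaml
---
plugins:
  detectors:
    misleading:
      MustContradictNLI:   # the HF-backed detector exercised in this PR
        hf_args:
          device: cuda     # load and run the detector model on GPU
```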
The format above is based on `HFDetector` being a class specifically designed to use a `TextClassificationPipeline`. Is this the approach desired, or is further abstraction of `ModelAsJudge` a goal of this revision?

Test example:

Logs and `nvidia-smi` show the detector model loading on the GPU. More validation of compatible models may be desired if this pattern is determined to be a way forward.