Conversation

NathanHB
Member

@NathanHB NathanHB commented Mar 10, 2025

  • Removed the override batch size and made it part of the model config.
  • Removed the env config; config should be part of the env directly, with no need to load it up separately.
  • Better loading of models:
    • Base class for model configs, allowing the model config to be parsed through the CLI or a config file.
    • Unified naming for model args, e.g. `model_name`.
  • Removed the OpenAI endpoint; we can just use LiteLLM for this. The same goes for TGI and inference endpoints: we don't really need them, and it's better to have a single interface.
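The config base class mentioned above could be sketched roughly as follows. This is a minimal illustration only; the names `ModelConfig`, `from_args`, and `from_dict` are hypothetical and not lighteval's actual API.

```python
from dataclasses import dataclass, fields

@dataclass
class ModelConfig:
    """Hypothetical base class for model configs: one unified arg name
    (model_name), batch size as part of the config, and two constructors
    covering the CLI-string and config-file paths."""

    model_name: str      # unified naming for the model argument
    batch_size: int = 1  # batch size lives on the model config, not an override

    @classmethod
    def from_args(cls, args: str) -> "ModelConfig":
        """Parse a CLI-style string such as 'model_name=gpt2,batch_size=8'."""
        raw = dict(pair.split("=", 1) for pair in args.split(","))
        field_types = {f.name: f.type for f in fields(cls)}
        kwargs = {
            key: int(value) if field_types.get(key) is int else value
            for key, value in raw.items()
        }
        return cls(**kwargs)

    @classmethod
    def from_dict(cls, data: dict) -> "ModelConfig":
        """Build a config from a dict, e.g. one loaded from a YAML file."""
        return cls(**data)
```

With such a base class, subclasses for each backend would only add their own fields while sharing the CLI/config-file parsing.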

```python
    padding_side="left",
    truncation_side="left",
)
except FileNotFoundError:
```
Member Author


This should not be needed, as it is an issue with the tokenizer version or config.

@HuggingFaceDocBuilderDev
Collaborator

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@NathanHB NathanHB merged commit feb8b70 into main Apr 14, 2025
5 checks passed
hynky1999 pushed a commit that referenced this pull request May 22, 2025
@fxmarty-amd

see: #857

NathanHB added a commit that referenced this pull request Sep 19, 2025