
Conversation

NathanHB (Member):

Fixes and revamps the from_model function.

The function can now only take a transformers-loaded model, not a lighteval model. We must also provide a lighteval TransformerModelConfig instead of the multiple optional args.
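As a rough sketch of the new call pattern described above (the class bodies, parameter names, and the RawTransformersModel placeholder here are stand-ins modelled on the PR description, not the actual lighteval implementation):

```python
from dataclasses import dataclass


@dataclass
class TransformerModelConfig:  # stand-in for lighteval's config class
    model_name: str
    batch_size: int = 1


class RawTransformersModel:  # stand-in for a transformers-loaded model
    pass


class TransformersModel:  # stand-in for the lighteval wrapper
    def __init__(self, model, config):
        self.model = model
        self.config = config

    @classmethod
    def from_model(cls, model, config):
        # Both arguments are now required: `model` must be a raw
        # transformers model and `config` a TransformerModelConfig,
        # replacing the old pile of optional keyword args.
        if not isinstance(config, TransformerModelConfig):
            raise TypeError("a TransformerModelConfig is required")
        return cls(model, config)


wrapped = TransformersModel.from_model(
    model=RawTransformersModel(),
    config=TransformerModelConfig(model_name="gpt2"),
)
```

The point of the refactor is that every setting travels inside the config object, so the wrapper never has to guess defaults from scattered keyword arguments.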

@HuggingFaceDocBuilderDev (Collaborator):

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

Copilot: This comment was marked as outdated.

@NathanHB NathanHB marked this pull request as draft August 14, 2025 12:41
@NathanHB NathanHB marked this pull request as ready for review August 14, 2025 13:23
@NathanHB NathanHB requested a review from Copilot August 14, 2025 13:23
Copilot: This comment was marked as outdated.

Reviewed code context:

    return load_model(config=model_config)
    if isinstance(model, TransformersModel):
    if isinstance(model, LightevalModel):
clefourrier (Member):

hm so model > model_config? maybe we should print it somewhere

NathanHB (Member, Author):

wdym?

clefourrier (Member):

you first check if there is a model config, then if not you work with the model - so if you provide both a model config and model, the model is completely skipped

NathanHB (Member, Author):

mhh no, I first check if model is None: if no model is provided we use the model_config; otherwise we check if the model is a LightevalModel, else we do from_model.
Maybe I'm missing something in the logic here?

clefourrier (Member), Aug 18, 2025:

hm sorry, you're right, it's the other way around - if you provide a model you ignore the config

clefourrier (Member):

I think we should keep a system where we either provide the model or the config but never both, because here if you provide model + config for a LightevalModel, the config just gets ignored

clefourrier (Member), Aug 18, 2025:

or more specifically, put the logic the other way around:

if model and config:  # both exist
    if transformers model:
        from pretrained
    else:
        raise exception

elif model:  # no config, therefore
    return model

elif config:  # no model here
    return model from config
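The proposed ordering can be written out as runnable Python with stand-in types (LightevalModel, PreTrainedModel, and the returned action tags are placeholders for illustration, not the real lighteval or transformers APIs):

```python
# Stand-in classes; the real lighteval / transformers types are assumed.
class LightevalModel: ...
class PreTrainedModel: ...  # placeholder for a raw transformers model


def resolve_model(model=None, config=None):
    """Dispatch following the ordering proposed above: check the
    model+config combination first, so nothing is silently ignored."""
    if model is not None and config is not None:  # both exist
        if isinstance(model, PreTrainedModel):
            # raw transformers model + config -> build via from_model
            return ("from_model", model, config)
        # a config plus an already-built lighteval model is ambiguous
        raise ValueError("cannot combine a config with a LightevalModel")
    elif model is not None:  # no config, therefore use the model as-is
        return ("as_is", model, None)
    elif config is not None:  # no model here, load from the config
        return ("load_model", None, config)
    raise ValueError("provide either a model or a config")
```

Because the both-provided case is handled first and raises for an invalid pairing, a user who passes conflicting arguments gets an error instead of having one argument silently dropped.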

NathanHB (Member, Author):

Yeah but I log a warning (line below) if the user provided a config with a LightevalModel

NathanHB (Member, Author):

that's clearer! Changing :)

clefourrier (Member):

cool! I think we should just assume users will ignore warnings, so we should make the code base not easily buggable ^^"

Copilot AI (Contributor) left a comment:

Pull Request Overview

This PR refactors the from_model function in TransformersModel to accept only HuggingFace transformers models and requires a TransformerModelConfig instead of multiple optional parameters. The changes streamline model initialization and improve type safety.

  • Simplified from_model function signature and removed support for LightevalModel instances
  • Updated pipeline initialization logic to handle different model/config combinations more clearly
  • Added comprehensive test coverage for the new from_model functionality

Reviewed Changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated 1 comment.

Files changed:

  • src/lighteval/models/transformers/transformers_model.py: Refactored from_model method to require a config parameter and only accept HuggingFace models
  • src/lighteval/pipeline.py: Updated model initialization logic to handle the new from_model signature
  • tests/models/test_transformers_model.py: Added a new test class to verify from_model functionality with comprehensive attribute testing



    if is_accelerate_available():
        model_size, _ = calculate_maximum_sizes(self.model)
        model_size = convert_bytes(model_size)
Copilot AI, Aug 18, 2025:

The precision attribute is no longer being set in the from_model method, but it may be used elsewhere in the class. This could cause AttributeError if the attribute is accessed later.


clefourrier (Member) left a comment:

We can remove self.precision from all models I think, never used (re copilot's message)

@NathanHB NathanHB merged commit 4d2bf42 into main Aug 18, 2025
5 checks passed
@NathanHB NathanHB added the bug label Sep 15, 2025
NathanHB added a commit that referenced this pull request Sep 19, 2025
* fixes from_model function and adds tests

* removes mock accelerator from tests

* fixes tests by adding model_configs where needed

* Apply suggestion from @Copilot

Co-authored-by: Copilot <[email protected]>

* fixes from review

* fixes from review

* fixes from review

* fixes from review

---------

Co-authored-by: Copilot <[email protected]>