chore(typing): initial ty integration #43396
tarekziade wants to merge 23 commits into huggingface:main

Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

View the CircleCI Test Summary for this PR: https://huggingface.co/spaces/transformers-community/circle-ci-viz?pr=43396&sha=6cce79
Cyrilvallez left a comment:

Alright! Happy to start with ty! Asked quite a few questions in the comments, more for my own understanding and to get familiar with ty than anything else.
- account_id = os.getenv("TRAINING_JOB_ARN").split(":")[4] if "TRAINING_JOB_ARN" in os.environ else None
+ training_job_arn = os.getenv("TRAINING_JOB_ARN")
+ account_id = training_job_arn.split(":")[4] if training_job_arn is not None else None
What's the reason for ty complaining on this one?
`possibly-missing-attribute`, because `os.getenv()` can return None and ty can't know that the `in os.environ` check is equivalent to a None check. I think it's a pretty reasonable tradeoff; we remove that uncertainty by storing the `getenv()` value.
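As a runnable sketch of the narrowing (the ARN value below is made up for illustration):

```python
import os

# Simulate the environment for the sketch (hypothetical ARN value).
os.environ["TRAINING_JOB_ARN"] = "arn:aws:sagemaker:us-east-1:123456789012:training-job/demo"

# Binding os.getenv() to a local variable first lets the checker narrow
# `str | None` down to `str` inside the conditional expression, so the
# `.split()` call is provably safe.
training_job_arn = os.getenv("TRAINING_JOB_ARN")
account_id = training_job_arn.split(":")[4] if training_job_arn is not None else None

print(account_id)  # → 123456789012
```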
if hasattr(torch.backends, "fp32_precision"):
    torch.backends.fp32_precision = precision_mode
It seems to require a lot of existence checks. Do you know how they are chosen to be required or not? Does ty base itself e.g. on your currently installed version of torch in the env? Because that may become an issue.
In this very specific case ty can't infer the existence of these attributes:
- At runtime, the torch.backends module replaces itself with a GenericModule instance (line 122 in torch/backends/__init__.py)
- GenericModule defines fp32_precision as a ContextProp descriptor (lines 116-119)
- However, there's no .pyi stub file for torch.backends, so type checkers can't see that the attribute exists
> Does ty base itself e.g. on your currently installed version of torch in the env?

Yes, it uses its stubs.
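For a self-contained illustration of the hasattr pattern (using a SimpleNamespace stand-in instead of the real torch.backends, so it runs without torch installed):

```python
from types import SimpleNamespace

# Stand-in for a module object whose attributes vary across versions;
# the real case is torch.backends, which only exposes `fp32_precision`
# in recent torch releases.
backends = SimpleNamespace()

def set_fp32_precision(mode: str) -> bool:
    # The explicit existence check is visible to both the runtime and
    # the type checker: older "versions" don't crash, and the checker
    # no longer flags a possibly-missing attribute.
    if hasattr(backends, "fp32_precision"):
        backends.fp32_precision = mode
        return True
    return False

print(set_fp32_precision("tf32"))   # False: attribute absent here
backends.fp32_precision = "ieee"    # simulate a newer version
print(set_fp32_precision("tf32"))   # True: attribute now exists
```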
Thanks for the first pass! One remark: it's all the same pattern:

So it makes sense for a type checker to detect that one possible code execution path is to call
[For maintainers] Suggested jobs to run (before merge): run-slow: dia, whisper

run-slow: dia, whisper

This comment contains models: ["models/dia", "models/whisper"]
Superseded by #44167.
What does this PR do?
Initial `ty` integration. To avoid a gigantic, risky patch, let's start with a baby step where we add the tooling to `make repo-check` and activate it on a subset of the repo. That gives us a human-readable patch, and allows us to get comfortable with the tool before we expand it.
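As a sketch, the Make wiring could look something like this (target name and paths are assumptions, not necessarily the PR's exact rule):

```make
# Hypothetical Makefile fragment: run ty only on the activated subset
repo-check:
	ty check src/transformers/utils
```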
This patch:
- adds `ty` to `make repo-check` and the CI
- activates it on `src/transformers/utils`

Before submitting
- Did you read the contributor guideline, Pull Request section?
- Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
- Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.