transformers.__spec__ returning None. Causing downstream import errors #12904

Closed
prikmm opened this issue Jul 27, 2021 · 7 comments · Fixed by #13321 or Kaggle/docker-python#1087

prikmm commented Jul 27, 2021

Environment info

  • transformers version: tried on 4.6.1 (the current default Kaggle version), 4.8.1, 4.8.2, and 4.9.1
  • Platform: Colab / Kaggle / my local runtime
  • Python version: 3.7.11

Who can help

Information

This is causing downstream import errors. For example, right now I am unable to import lightning-flash, because it uses __spec__ to check whether transformers is available.

ValueError                                Traceback (most recent call last)
<ipython-input-3-76e523923a79> in <module>
      5 print(transformers.__version__)
      6 print(transformers.__spec__)
----> 7 from flash import Trainer
      8 #from flash.core.data.utils import download_data
      9 #from flash.text import SummarizationData, SummarizationTask

/opt/conda/lib/python3.7/site-packages/flash/__init__.py in <module>
     16 
     17 from flash.__about__ import *  # noqa: F401 F403
---> 18 from flash.core.utilities.imports import _TORCH_AVAILABLE
     19 
     20 if _TORCH_AVAILABLE:

/opt/conda/lib/python3.7/site-packages/flash/core/utilities/imports.py in <module>
     75 _PYTORCHVIDEO_AVAILABLE = _module_available("pytorchvideo")
     76 _MATPLOTLIB_AVAILABLE = _module_available("matplotlib")
---> 77 _TRANSFORMERS_AVAILABLE = _module_available("transformers")
     78 _PYSTICHE_AVAILABLE = _module_available("pystiche")
     79 _FIFTYONE_AVAILABLE = _module_available("fiftyone")

/opt/conda/lib/python3.7/site-packages/flash/core/utilities/imports.py in _module_available(module_path)
     36     """
     37     try:
---> 38         return find_spec(module_path) is not None
     39     except AttributeError:
     40         # Python 3.6

/opt/conda/lib/python3.7/importlib/util.py in find_spec(name, package)
    112         else:
    113             if spec is None:
--> 114                 raise ValueError('{}.__spec__ is None'.format(name))
    115             return spec
    116 

ValueError: transformers.__spec__ is None

To reproduce


import transformers
print(transformers.__version__)
print(transformers.__spec__)

4.9.1
None

Expected behavior

Properly defined __spec__
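For illustration, the expected behavior expressed as code (a minimal sketch using only the standard library; nothing here is specific to this issue beyond the package name):

import importlib.util

import transformers

# Expected: the import system leaves transformers.__spec__ set to a
# ModuleSpec, so a find_spec-based availability check keeps working
# even after the package has been imported.
assert transformers.__spec__ is not None
assert importlib.util.find_spec("transformers") is not None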

sgugger (Collaborator) commented Jul 27, 2021

__spec__ is used by the Python import system internally; I don't see anywhere that it should be defined manually by the package creators. If you have more resources about this I'm happy to look into what we could add, but a quick Google search yields nothing.
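For illustration, one way a package can end up with __spec__ set to None even though the import system populated it originally is replacing its own entry in sys.modules with a fresh module object and not copying the attribute over. A minimal, hypothetical sketch (an assumption about the likely mechanism, not verified against transformers' source):

# my_lazy_pkg/__init__.py -- hypothetical package, not transformers' actual code
import sys
from types import ModuleType


class _LazyModule(ModuleType):
    # Stand-in for a lazy-import wrapper that defers heavy submodule imports.
    pass


# The import system created sys.modules["my_lazy_pkg"] with __spec__ filled in.
# Swapping in a freshly constructed module drops that: the new object's
# __spec__ is None unless it is copied over explicitly.
_lazy = _LazyModule(__name__)
# _lazy.__spec__ = __spec__  # <- omitting this line leaves my_lazy_pkg.__spec__ as None
sys.modules[__name__] = _lazy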

prikmm (Author) commented Jul 27, 2021

__spec__ is used by the Python import system internally; I don't see anywhere that it should be defined manually by the package creators. If you have more resources about this I'm happy to look into what we could add, but a quick Google search yields nothing.

My bad, at the time of the error I found this issue on tensorflow/tensorflow#30028 and thought it was the same. After reading this, I somewhat understand the functionality of __spec__ now. 👍

prikmm closed this as completed Jul 27, 2021

laurahanu (Contributor)

@sgugger I'm also getting the same error with the latest transformers version (4.9.2) when trying to use torch.hub to load a model that has transformers as a dependency. It seems that torch.hub checks whether dependencies exist by verifying that transformers.__spec__ is not None (source code here), and raises an error when it is.

Before, I was using an older version of transformers (3.9.2) that returned a ModuleSpec object for transformers.__spec__, so loading the same model with torch.hub worked. Just wondering why this has changed and whether __spec__ should be defined?
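A minimal sketch of the kind of find_spec-based dependency check described above (the helper name is hypothetical, not torch.hub's actual code), showing how it breaks once an affected transformers version has been imported:

import importlib.util

import transformers  # on affected versions, transformers.__spec__ is None


def check_dependencies(packages):
    # Hypothetical helper: report a package as missing when find_spec
    # cannot locate it, similar in spirit to what hub loaders do.
    missing = [pkg for pkg in packages if importlib.util.find_spec(pkg) is None]
    if missing:
        raise RuntimeError(f"Missing dependencies: {missing}")


# Because transformers is already imported and its __spec__ is None,
# find_spec() raises ValueError('transformers.__spec__ is None') instead
# of returning a spec, so the check errors out even though the package exists.
check_dependencies(["transformers"])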

laurahanu (Contributor)

After investigating this further, it does seem particular to the transformers library that __spec__ is None after it has been imported (other libraries still return a ModuleSpec without defining it explicitly).

Although Python's import system normally handles __spec__ internally and it does not need to be defined manually, it should still end up populated automatically, and a None value can cause downstream problems, e.g. when checking that dependencies exist.

Looks like the difference lies in whether transformers has already been imported or not:

In [1]: import importlib

In [2]: importlib.util.find_spec("transformers") is not None
Out[2]: True

In [3]: import transformers

In [4]: importlib.util.find_spec("transformers") is not None
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-4-6fdb35471f82> in <module>
----> 1 importlib.util.find_spec("transformers") is not None

~/opt/miniconda3/envs/pt/lib/python3.8/importlib/util.py in find_spec(name, package)
    112         else:
    113             if spec is None:
--> 114                 raise ValueError('{}.__spec__ is None'.format(name))
    115             return spec
    116

ValueError: transformers.__spec__ is None

This looks like something specific to the transformers package though; it doesn't happen with, e.g., numpy:

In [5]: importlib.util.find_spec("numpy") is not None
Out[5]: True

In [6]: import numpy

In [7]: importlib.util.find_spec("numpy") is not None
Out[7]: True
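The asymmetry comes from how importlib.util.find_spec handles already-imported modules: for a name present in sys.modules it simply returns module.__spec__, and raises ValueError when that attribute is None, as the traceback above shows. A downstream availability check can be made tolerant of this; a minimal sketch with a hypothetical helper name:

import importlib.util
import sys


def module_available(name):
    # Return True if `name` can be imported, tolerating a None __spec__
    # on packages that are already loaded (hypothetical helper).
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        # A parent package in a dotted path does not exist.
        return False
    except ValueError:
        # The module is already imported but its __spec__ is None
        # (the situation in this issue); treat it as available.
        return name in sys.modules

With a check like this, module_available("transformers") stays True both before and after import transformers, even on the affected versions.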

pratikchhapolika

How to solve this issue?

LysandreJik (Member)

This issue should be solved in transformers version v4.10.x.

talhaanwarch

This issue should be solved in transformers version v4.10.x.

I tried transformers 4.15.0 and the error is still there.
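For anyone pinned to an affected version, one possible workaround (a sketch, not an official fix) is to assign a ModuleSpec to the already-imported package by hand before any downstream find_spec-based check runs:

import importlib.machinery
import importlib.util

import transformers

# On affected versions transformers.__spec__ is None; give it a bare
# ModuleSpec so importlib.util.find_spec("transformers") stops raising.
if transformers.__spec__ is None:
    transformers.__spec__ = importlib.machinery.ModuleSpec(
        "transformers", loader=None, origin=getattr(transformers, "__file__", None)
    )

assert importlib.util.find_spec("transformers") is not None

Whether this is safe depends on what else inspects the spec (the loader is left as None here), so treat it as a stopgap until an upgrade picks up the upstream fix.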
