
Use MAGMA option for PyTorch 2+#3683

Open
Flamefire wants to merge 3 commits into easybuilders:develop from Flamefire:20250402122705_new_pr_pytorch

Conversation

@Flamefire (Contributor) commented Apr 2, 2025

(created using eb --new-pr)

I found that warning while investigating PyTorch 2.6.
This was likely due to a merge conflict or an oversight when creating #3070, which introduced this:

  • Magma should not be used in CPU builds
  • In GPU builds it is optional

Reflect this correctly in the code. See their CMakeLists for reference: https://github.com/pytorch/pytorch/blob/40a6710ad34ae4c6f4987f0e47b5c94df3fc8ec7/cmake/Dependencies.cmake#L1692-L1706
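The intended behaviour, roughly following PyTorch's cmake/Dependencies.cmake, could be sketched as follows (the function name and parameters are hypothetical, not the easyblock's actual API):

```python
def magma_option(has_cuda, has_rocm, magma_available):
    """Return the USE_MAGMA CMake setting for a PyTorch build.

    Hypothetical helper mirroring the rules above: MAGMA is only
    meaningful for GPU builds, and even there it is optional.
    """
    with_gpu_support = has_cuda or has_rocm
    if with_gpu_support and magma_available:
        return 'USE_MAGMA=1'
    # CPU-only builds (or GPU builds without MAGMA) must disable it
    return 'USE_MAGMA=0'
```

This makes the two bullet points explicit: a CPU-only build never enables MAGMA, and a GPU build enables it only when it is actually available.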

Given that this is PyTorch, I suggest just inspecting the code rather than doing a full test run, or at least running with --skip-test-step, as the change is pretty trivial.

if has_rocm:
    with_gpu_support = True

if pytorch_version >= 'v1.10.0':
A Member commented:

Is the v in the version intentional? I would have thought that might trip up the version comparison, even with LooseVersion.

A Member commented:

@Flamefire my suspicion was correct:

>>> from easybuild.tools import LooseVersion
>>> v = LooseVersion('2.1.2')
>>> v >= 'v1.10.2'
False
>>> v >= LooseVersion('v1.10.2')
False
>>> v >= '1.10.2'
True
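The REPL session above shows that a leading 'v' silently breaks the comparison. A minimal sketch of the pitfall and a fix, using plain tuple comparison instead of LooseVersion (function names `version_tuple` and `at_least` are hypothetical, for illustration only):

```python
def version_tuple(version):
    """Parse a dotted numeric version string into a tuple of ints.

    A leading 'v' would make int() raise ValueError, which is why
    it must be stripped before comparing.
    """
    return tuple(int(part) for part in version.split('.'))

def at_least(version, minimum):
    # Strip an accidental leading 'v' so 'v1.10.2' and '1.10.2'
    # compare as the same version
    return version_tuple(version.lstrip('v')) >= version_tuple(minimum.lstrip('v'))
```

With this, `at_least('2.1.2', 'v1.10.2')` is True, matching the intended semantics, whereas the LooseVersion comparison against the 'v'-prefixed string returned False.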

Flamefire (Contributor, Author) commented:

I see. This seems to have gone unnoticed for quite some time now.
