Description
Expected behavior
TVM should find the LLVM ld.lld linker binary
Actual behavior
When running mlc_llm chat with JIT compiling on, TVM fails to find the LLVM installation, throwing
RuntimeError: cannot find ld.lld, canditates are: ['ld.lld-17.0', 'ld.lld-17', 'ld.lld', '/opt/rocm/llvm/bin']
Environment
Testing in an MLC docker container with fresh installs of nightly
Steps to reproduce
Run mlc_llm chat HF://<model>; it will download the model, compile it, then crash when saving the .so file.
Triage
Line 55 of https://github.com/mlc-ai/relax/blob/mlc/python/tvm/contrib/rocm.py fails to join the candidate binary name (ld.lld, or whichever candidate it found in the lines above) onto the /opt/rocm/llvm/bin directory. The bare directory path is then passed to os.path.isfile in https://github.com/mlc-ai/relax/blob/mlc/python/tvm/contrib/utils.py#L253, which returns False for directories, so the lookup returns None and the RuntimeError above is raised.
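To illustrate the failure mode, here is a minimal sketch of the candidate-search pattern described above. The function names and signatures are hypothetical (not TVM's actual API); the point is that appending the bare fallback directory to the candidate list makes os.path.isfile reject it, whereas joining each candidate name onto the directory lets the check succeed.

```python
import os

FALLBACK_DIR = "/opt/rocm/llvm/bin"  # assumed ROCm LLVM location, as in the error message

def find_lld_buggy(candidates, fallback_dir=FALLBACK_DIR):
    # Buggy variant: the bare directory is appended as a candidate.
    # os.path.isfile() is False for directories, so the fallback never matches.
    for path in candidates + [fallback_dir]:
        if os.path.isfile(path):
            return path
    return None  # -> caller raises "cannot find ld.lld, ... '/opt/rocm/llvm/bin'"

def find_lld_fixed(candidates, fallback_dir=FALLBACK_DIR):
    # Fixed variant: each candidate name is joined onto the fallback directory,
    # producing real file paths like /opt/rocm/llvm/bin/ld.lld.
    full = list(candidates) + [os.path.join(fallback_dir, c) for c in candidates]
    for path in full:
        if os.path.isfile(path):
            return path
    return None
```

With the fix, a container that ships ld.lld only under /opt/rocm/llvm/bin is found correctly, while the buggy variant returns None even though the linker exists.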