[Bug] Could not convert TVM object of type runtime.Closure to a string. #906

@scottorly

Description

@scottorly

🐛 Bug

To Reproduce

Steps to reproduce the behavior:

Calling chat.reload(self.modelLib, modelPath: modelPath, appConfigJson: "")

produces the following crash:

Check failed: (IsObjectRef<tvm::runtime::String>()) is false: Could not convert TVM object of type runtime.Closure to a string.
Stack trace:
0x000000010009ecc4 tvm::runtime::detail::LogFatal::Entry::Finalize() + 68
0x000000010009ec80 tvm::runtime::detail::LogFatal::Entry::Finalize() + 0
0x000000010009dcf4 __clang_call_terminate + 0
0x00000001000acbfc tvm::runtime::TVMArgValue::operator std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>() const + 716
0x00000001000ac318 tvm::runtime::PackedFuncValueConverter<tvm::runtime::String>::From(tvm::runtime::TVMArgValue const&) + 104
0x00000001000b5928 mlc::llm::LLMChatModule::GetFunction(tvm::runtime::String const&, tvm::runtime::ObjectPtr<tvm::runtime::Object> const&)::'lambda'(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)::operator()(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*) const + 724
0x00000001000b5648 tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<mlc::llm::LLMChatModule::GetFunction(tvm::runtime::String const&, tvm::runtime::ObjectPtr<tvm::runtime::Object> const&)::'lambda'(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)>>::Call(tvm::runtime::PackedFuncObj const*, tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*) + 40
tvm::runtime::TVMRetValue tvm::runtime::PackedFunc::operator()<tvm::runtime::Module&, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>&, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>&>(tvm::runtime::Module&, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>&, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>&) const + 260
-[ChatModule reload:modelPath:appConfigJson:] + 408
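For reference, a minimal sketch of the call site that precedes the crash, based on the `-[ChatModule reload:modelPath:appConfigJson:]` frame in the trace above. The module name `MLCSwift` and the `modelLib`/`modelPath` values are placeholders, not confirmed names from this project:

```swift
import MLCSwift  // assumption: Swift module name for the MLC-LLM iOS bridge

let chat = ChatModule()

// Placeholder values; in the report these come from self.modelLib and
// a locally resolved model path.
let modelLib = "my-model-lib"
let modelPath = "/path/to/model"

// Passing an empty appConfigJson string is the step after which TVM
// reports: "Could not convert TVM object of type runtime.Closure to a
// string."
chat.reload(modelLib, modelPath: modelPath, appConfigJson: "")
```

This is a repro sketch only; it requires the MLC-LLM iOS framework to build and is not runnable standalone.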

Environment

  • Platform (e.g. WebGPU/Vulkan/iOS/Android/CUDA): iOS
  • Operating system (e.g. Ubuntu/Windows/macOS/...): macOS
  • Device (e.g. iPhone 12 Pro, PC+RTX 3090, ...): iPhone 14 Pro
  • How you installed MLC-LLM (conda, source): source
  • How you installed TVM-Unity (pip, source): pip
  • Python version (e.g. 3.10): 3.8
  • GPU driver version (if applicable):
  • CUDA/cuDNN version (if applicable):
  • TVM Unity Hash Tag (python -c "import tvm; print('\n'.join(f'{k}: {v}' for k, v in tvm.support.libinfo().items()))", applicable if you compile models):
  • Any other relevant information:

Additional context

Labels: bug (Confirmed bugs)