lora: make sure model keeps track of associated adapters #18490
ggerganov left a comment:
If we delegate it to the llama_model to free the adapters, then we should remove the llama_adapter_lora_free() from the public API.
For example, with the current change I believe we will be double freeing the adapters - once on model destruction and once more here for all llama_adapter_lora_ptr instances:
Lines 23 to 26 in cd78e57
Yes, that makes sense. By removing

@ngxson This is probably good to merge?

Ah yes, thanks. Merging it now
* lora: make sure model keep track of associated adapters
* deprecate llama_adapter_lora_free
* minor : std::unordered_set over std::set

Co-authored-by: Georgi Gerganov <ggerganov@gmail.com>
Ref: #18469 (comment)
I'm extracting this change into a dedicated PR for visibility.