Closed
Labels
llama.cpp related: Issues related to llama.cpp upstream source code, mostly unrelated to wllama
Description
Just a quick question: I take it this is an issue with the model itself? Or is there something I can do to fix it on my end, perhaps by adding the missing value manually?
☠️ WLLAMA: llama_model_load: error loading model:
error loading model hyperparameters:
key not found in model: phi3.attention.sliding_window
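For what it's worth, you can check whether the key is really absent from the GGUF header before suspecting wllama. A minimal sketch, assuming the separate `@huggingface/gguf` package (not part of wllama) and a placeholder model URL; whether re-conversion is the right fix is my assumption, not something the error states:

```js
import { gguf } from "@huggingface/gguf";

// Placeholder URL; substitute the model you are preloading.
const modelUrl = "https://huggingface.co/your-org/your-phi3-model/resolve/main/model.gguf";

// gguf() reads only the metadata section of the file, so this is cheap even for large models.
const { metadata } = await gguf(modelUrl);

if ("phi3.attention.sliding_window" in metadata) {
  console.log("sliding_window =", metadata["phi3.attention.sliding_window"]);
} else {
  console.log("Key is missing from the file's metadata; the GGUF likely predates the sliding_window change.");
}
```

If the key is genuinely missing from the file, the error would be coming from the llama.cpp loader rather than from anything in the preload code.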
Hmm, I'm actually pretty sure I was able to run this model in the past. Maybe something changed in llama.cpp?
I did just switch to preloading the model separately from starting it. My preload code:
let model_settings = { allow_offline: true };

// Report download progress to the UI once a meaningful amount of data has arrived.
model_settings['progressCallback'] = ({ loaded, total }) => {
    if (total != 0 && loaded > 1000000) {
        window.wllama_update_model_download_progress(loaded / total);
    }
};

await window.llama_cpp_app.downloadModel(task.download_url, model_settings);
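For reference, the load step that follows the preload would look roughly like this. A minimal sketch, assuming `window.llama_cpp_app` is a Wllama instance and that `loadModelFromUrl` (from wllama's documented API) re-uses the already-downloaded file when given the same URL; the prompt text is just a placeholder:

```js
// Later, when the model should actually be started:
await window.llama_cpp_app.loadModelFromUrl(task.download_url);

// Quick smoke test that the model loaded and can generate.
const output = await window.llama_cpp_app.createCompletion("Hello", {
  nPredict: 16,
});
console.log(output);
```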