Bump llama.cpp to b5686, fix build failures #754
Conversation
Hold off on merging this; it seems like there could be an issue with Windows builds.

I think I fixed it. At least the Windows CI passes on my fork of this repo.

Hmph. Actually, now Windows/Vulkan builds are broken. Vulkan builds work on Linux and macOS, though, and non-Vulkan builds work on all platforms I have tested, including Windows. This PR is still strictly an improvement over the current state.
Force-pushed from b65b7de to a92e90b
I found a workaround for the Windows/Vulkan issue, but it introduces new problems. I had pushed my Windows/Vulkan workaround to this branch, but I rolled it back for now. Let's just get llama.cpp bumped and the Linux/Vulkan stuff fixed. I'll make a separate PR for the other stuff 😇

The "separate PR" for the Windows/Vulkan issue I ran into is here: #767
The main motivation for bumping llama.cpp is to close #747.

In the meantime, there were some changes to how build-info.cpp is generated upstream, which caused a failure during the copy operation in build.rs. Simply removing the entire copy step seems to work, so that's what I did.
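For illustration, here is a minimal sketch of the kind of copy step build.rs might have performed, with a guard that skips it when the generated file is absent. The function name `copy_build_info` and the paths are hypothetical, not taken from this repo's actual build.rs; dropping the step entirely, as this PR does, amounts to never calling it.

```rust
use std::fs;
use std::path::Path;

// Hypothetical helper mirroring the removed copy step: copy the generated
// build-info.cpp out of llama.cpp's build tree into our own source dir.
// After the upstream change, the source file may no longer exist at this
// path, so an unconditional fs::copy fails the build. Returning early when
// the source is missing (or deleting this step outright) avoids the error.
fn copy_build_info(src: &Path, dst: &Path) -> std::io::Result<bool> {
    if !src.exists() {
        // Nothing to copy; newer llama.cpp handles build-info generation itself.
        return Ok(false);
    }
    fs::copy(src, dst)?;
    Ok(true)
}

fn main() -> std::io::Result<()> {
    // Demo against temp files; real build.rs would use llama.cpp's build dir.
    let tmp = std::env::temp_dir();
    let src = tmp.join("demo-build-info.cpp");
    let dst = tmp.join("demo-build-info-copy.cpp");
    fs::write(&src, "// generated build info")?;
    let copied = copy_build_info(&src, &dst)?;
    println!("copied: {copied}");
    Ok(())
}
```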