llama-cpp: Bump to version b6565 (#28433) #28435
Conversation
Force-pushed from 48fe9a3 to 64d8adb
Hi @akos-fulop-public, thanks for contributing! You need to add the new version:

```diff
--- a/recipes/llama-cpp/config.yml
+++ b/recipes/llama-cpp/config.yml
@@ -1,4 +1,6 @@
 versions:
+  "b6529":
+    folder: "all"
   "b4570":
     folder: "all"
   "b4079":
```
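For context, applying that hunk would leave the top of `recipes/llama-cpp/config.yml` looking roughly like the sketch below (the `b4079` entry's folder is assumed to be `"all"`, matching the other entries; older versions omitted):

```yaml
versions:
  "b6529":
    folder: "all"
  "b4570":
    folder: "all"
  "b4079":
    folder: "all"
```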
Force-pushed from 7e4918d to 60ff0b0
Hi @franramirez688,

Previous build failed on macOS.

The Windows build failed with the error shown in https://github.com/conan-io/conan-center-index/pull/28435/checks?check_run_id=51103353004. I opened a PR to the upstream with a quick fix: ggml-org/llama.cpp#16211

Thank you @uilianries,

That new release fixed that error in common.cpp, but the old version is failing due to a missing […]; I would suggest reverting.
Force-pushed from 5b04691 to cb83e7e
Thanks for the help @uilianries!
LGTM! Thank you @akos-fulop-public!! 😄
I see several changes since the last version, including new options.
In particular, the use of git commands with ggml, but it looks safe after comparing with the Linux build here: https://c3i.jfrog.io/artifactory/cci-build-logs/cci/prod/PR-28435/7/package_build_logs/build_log_llama-cpp_b6565_87db59a20d2209f720e7aefbd727fd4d_0f2ae57cd830dd618c06cce8e80c9fc879ab3710.success.txt
For more reference:
- ggml-org/llama.cpp@b4570...b6565#diff-1e7de1ae2d059d21e1dd75d5812d5a34b0222cef273b7c3a2af62eb747f9d20a
- ggml-org/llama.cpp@b4570...b6565#diff-a77d393383a84e0dd42ebd261b80443390335377813ca160e32abf8c52effac2
- ggml-org/llama.cpp@b4570...b6565#diff-d4bb67eec77d05ad8d4056c50492de104bd17dc6e0b0f9919854aeeaa5002842
* llama-cpp: Bump to version b6529 (conan-io#28433)
* add new llama-cpp version to config
* require macOS 13 and up
* llama-cpp: Bump to version b6565 (conan-io#28433)
* revert conanfile changes
Summary
Changes to recipe: llama-cpp/b6565
Motivation
Request: #28433
Details
Patch no longer seems necessary, as of ggml-org/llama.cpp#14613