
Installation stuck on Windows #1703

Open · thewh1teagle opened this issue Aug 24, 2024 · 11 comments

@thewh1teagle commented Aug 24, 2024

Prerequisites

Please answer the following questions for yourself before submitting an issue.

  • I am running the latest code. Development is very rapid so there are no tagged versions as of now.
  • I carefully followed the README.md.
  • I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
  • I reviewed the Discussions, and have a new bug or useful enhancement to share.

Expected Behavior

The build should complete quickly rather than getting stuck.

Current Behavior

The build gets stuck while compiling.

Environment and Context

Please provide detailed information about your computer setup. This is important in case the issue is not reproducible except for under certain specific conditions.

  • Physical (or virtual) hardware you are using, e.g. for Linux:

AMD Ryzen 5 4500U
Windows 11

  • SDK version, e.g. for Linux:
Python 3.9.0
CMake 3.30.1
MSVC

Failure Information (for bugs)

(venv) PS D:\sherpa-talk> $env:VULKAN_SDK = "C:\VulkanSDK\1.3.290.0"
(venv) PS D:\sherpa-talk> pip install .\llama-cpp-python\
Processing d:\sherpa-talk\llama-cpp-python
  Installing build dependencies ... canceled
ERROR: Operation cancelled by user
(venv) PS D:\sherpa-talk> pip install .\llama-cpp-python\ --verbose
Using pip 24.2 from d:\sherpa-talk\venv\lib\site-packages\pip (python 3.9)
Processing d:\sherpa-talk\llama-cpp-python
  Running command pip subprocess to install build dependencies
  Using pip 24.2 from D:\sherpa-talk\venv\Lib\site-packages\pip (python 3.9)
  Collecting scikit-build-core>=0.9.2 (from scikit-build-core[pyproject]>=0.9.2)
    Obtaining dependency information for scikit-build-core>=0.9.2 from https://files.pythonhosted.org/packages/20/f0/11b0f09173051647af2e140f68f3d94432c5b41a6ea0d45a43e38ab68192/scikit_build_core-0.10.5-py3-none-any.whl.metadata
    Using cached scikit_build_core-0.10.5-py3-none-any.whl.metadata (20 kB)
  Collecting exceptiongroup>=1.0 (from scikit-build-core>=0.9.2->scikit-build-core[pyproject]>=0.9.2)
    Obtaining dependency information for exceptiongroup>=1.0 from https://files.pythonhosted.org/packages/02/cc/b7e31358aac6ed1ef2bb790a9746ac2c69bcb3c8588b41616914eb106eaf/exceptiongroup-1.2.2-py3-none-any.whl.metadata
    Using cached exceptiongroup-1.2.2-py3-none-any.whl.metadata (6.6 kB)
  Collecting packaging>=21.3 (from scikit-build-core>=0.9.2->scikit-build-core[pyproject]>=0.9.2)
    Obtaining dependency information for packaging>=21.3 from https://files.pythonhosted.org/packages/08/aa/cc0199a5f0ad350994d660967a8efb233fe0416e4639146c089643407ce6/packaging-24.1-py3-none-any.whl.metadata
    Using cached packaging-24.1-py3-none-any.whl.metadata (3.2 kB)
  Collecting pathspec>=0.10.1 (from scikit-build-core>=0.9.2->scikit-build-core[pyproject]>=0.9.2)
    Obtaining dependency information for pathspec>=0.10.1 from https://files.pythonhosted.org/packages/cc/20/ff623b09d963f88bfde16306a54e12ee5ea43e9b597108672ff3a408aad6/pathspec-0.12.1-py3-none-any.whl.metadata
    Using cached pathspec-0.12.1-py3-none-any.whl.metadata (21 kB)
  Collecting tomli>=1.2.2 (from scikit-build-core>=0.9.2->scikit-build-core[pyproject]>=0.9.2)
    Obtaining dependency information for tomli>=1.2.2 from https://files.pythonhosted.org/packages/97/75/10a9ebee3fd790d20926a90a2547f0bf78f371b2f13aa822c759680ca7b9/tomli-2.0.1-py3-none-any.whl.metadata
    Using cached tomli-2.0.1-py3-none-any.whl.metadata (8.9 kB)
  Using cached scikit_build_core-0.10.5-py3-none-any.whl (164 kB)
  Using cached exceptiongroup-1.2.2-py3-none-any.whl (16 kB)
  Using cached packaging-24.1-py3-none-any.whl (53 kB)
  Using cached pathspec-0.12.1-py3-none-any.whl (31 kB)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
  Installing collected packages: tomli, pathspec, packaging, exceptiongroup, scikit-build-core
  Successfully installed exceptiongroup-1.2.2 packaging-24.1 pathspec-0.12.1 scikit-build-core-0.10.5 tomli-2.0.1
  Installing build dependencies ... done
  Running command Getting requirements to build wheel
  Getting requirements to build wheel ... done
  Running command Preparing metadata (pyproject.toml)
  *** scikit-build-core 0.10.5 using CMake 3.30.1 (metadata_wheel)
  Preparing metadata (pyproject.toml) ... done
Collecting typing-extensions>=4.5.0 (from llama_cpp_python==0.2.89)
  Obtaining dependency information for typing-extensions>=4.5.0 from https://files.pythonhosted.org/packages/26/9f/ad63fc0248c5379346306f8668cda6e2e2e9c95e01216d2b8ffd9ff037d0/typing_extensions-4.12.2-py3-none-any.whl.metadata
  Using cached typing_extensions-4.12.2-py3-none-any.whl.metadata (3.0 kB)
  Link requires a different Python (3.9.0 not in: '>=3.10'): https://files.pythonhosted.org/packages/d1/0f/8d2b5ebb01dc49d20ae0a282d6baff7202b7bf0df8acdd4a6abeffe98070/numpy-2.1.0rc1.tar.gz (from https://pypi.org/simple/numpy/) (requires-python:>=3.10)
  Link requires a different Python (3.9.0 not in: '>=3.10'): https://files.pythonhosted.org/packages/54/a4/f8188c4f3e07f7737683588210c073478abcb542048cf4ab6fedad0b458a/numpy-2.1.0.tar.gz (from https://pypi.org/simple/numpy/) (requires-python:>=3.10)     
Collecting numpy>=1.20.0 (from llama_cpp_python==0.2.89)
  Obtaining dependency information for numpy>=1.20.0 from https://files.pythonhosted.org/packages/52/87/bb45780eb4b9ed1e4710c2f2b42ed7224071aef6f08152f2520df0ec2ee5/numpy-2.0.1-cp39-cp39-win_amd64.whl.metadata
  Using cached numpy-2.0.1-cp39-cp39-win_amd64.whl.metadata (60 kB)
Collecting diskcache>=5.6.1 (from llama_cpp_python==0.2.89)
  Obtaining dependency information for diskcache>=5.6.1 from https://files.pythonhosted.org/packages/3f/27/4570e78fc0bf5ea0ca45eb1de3818a23787af9b390c0b0a0033a1b8236f9/diskcache-5.6.3-py3-none-any.whl.metadata
  Using cached diskcache-5.6.3-py3-none-any.whl.metadata (20 kB)
Collecting jinja2>=2.11.3 (from llama_cpp_python==0.2.89)
  Obtaining dependency information for jinja2>=2.11.3 from https://files.pythonhosted.org/packages/31/80/3a54838c3fb461f6fec263ebf3a3a41771bd05190238de3486aae8540c36/jinja2-3.1.4-py3-none-any.whl.metadata
  Using cached jinja2-3.1.4-py3-none-any.whl.metadata (2.6 kB)
Collecting MarkupSafe>=2.0 (from jinja2>=2.11.3->llama_cpp_python==0.2.89)
  Obtaining dependency information for MarkupSafe>=2.0 from https://files.pythonhosted.org/packages/f6/f8/4da07de16f10551ca1f640c92b5f316f9394088b183c6a57183df6de5ae4/MarkupSafe-2.1.5-cp39-cp39-win_amd64.whl.metadata
  Using cached MarkupSafe-2.1.5-cp39-cp39-win_amd64.whl.metadata (3.1 kB)
Using cached diskcache-5.6.3-py3-none-any.whl (45 kB)
Using cached jinja2-3.1.4-py3-none-any.whl (133 kB)
Using cached numpy-2.0.1-cp39-cp39-win_amd64.whl (16.6 MB)
Using cached typing_extensions-4.12.2-py3-none-any.whl (37 kB)
Using cached MarkupSafe-2.1.5-cp39-cp39-win_amd64.whl (17 kB)
Building wheels for collected packages: llama_cpp_python
  Running command Building wheel for llama_cpp_python (pyproject.toml)
  *** scikit-build-core 0.10.5 using CMake 3.30.1 (wheel)
  *** Configuring CMake...
  2024-08-24 20:59:52,528 - scikit_build_core - WARNING - Can't find a Python library, got libdir=None, ldlibrary=None, multiarch=None, masd=None
  loading initial cache file C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\CMakeInit.txt
  -- Building for: Visual Studio 17 2022
  -- Selecting Windows SDK version 10.0.22000.0 to target Windows 10.0.22631.
  -- The C compiler identification is MSVC 19.40.33812.0
  -- The CXX compiler identification is MSVC 19.40.33812.0
  -- Detecting C compiler ABI info
  -- Detecting C compiler ABI info - done
  -- Check for working C compiler: C:/Program Files/Microsoft Visual Studio/2022/Community/VC/Tools/MSVC/14.40.33807/bin/Hostx64/x64/cl.exe - skipped
  -- Detecting C compile features
  -- Detecting C compile features - done
  -- Detecting CXX compiler ABI info
  -- Detecting CXX compiler ABI info - done
  -- Check for working CXX compiler: C:/Program Files/Microsoft Visual Studio/2022/Community/VC/Tools/MSVC/14.40.33807/bin/Hostx64/x64/cl.exe - skipped
  -- Detecting CXX compile features
  -- Detecting CXX compile features - done
  -- Found Git: C:/Program Files/Git/cmd/git.exe (found version "2.45.2.windows.1")
  -- Performing Test CMAKE_HAVE_LIBC_PTHREAD
  -- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
  -- Looking for pthread_create in pthreads
  -- Looking for pthread_create in pthreads - not found
  -- Looking for pthread_create in pthread
  -- Looking for pthread_create in pthread - not found
  -- Found Threads: TRUE
  -- Found OpenMP_C: -openmp (found version "2.0")
  -- Found OpenMP_CXX: -openmp (found version "2.0")
  -- Found OpenMP: TRUE (found version "2.0")
  -- OpenMP found
  -- Using llamafile
  -- Found Vulkan: C:/VulkanSDK/1.3.290.0/Lib/vulkan-1.lib (found version "1.3.290") found components: glslc glslangValidator
  -- Vulkan found
  -- Warning: ccache not found - consider installing it for faster compilation or disable this warning with GGML_CCACHE=OFF
  -- CMAKE_SYSTEM_PROCESSOR: AMD64
  -- CMAKE_GENERATOR_PLATFORM: x64
  -- x86 detected
  -- Performing Test HAS_AVX_1
  -- Performing Test HAS_AVX_1 - Success
  -- Performing Test HAS_AVX2_1
  -- Performing Test HAS_AVX2_1 - Success
  -- Performing Test HAS_FMA_1
  -- Performing Test HAS_FMA_1 - Success
  -- Performing Test HAS_AVX512_1
  -- Performing Test HAS_AVX512_1 - Failed
  -- Performing Test HAS_AVX512_2
  -- Performing Test HAS_AVX512_2 - Failed
  CMake Warning (dev) at CMakeLists.txt:9 (install):
    Target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
  Call Stack (most recent call first):
    CMakeLists.txt:73 (llama_cpp_python_install_target)
  This warning is for project developers.  Use -Wno-dev to suppress it.

  CMake Warning (dev) at CMakeLists.txt:17 (install):
    Target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
  Call Stack (most recent call first):
    CMakeLists.txt:73 (llama_cpp_python_install_target)
  This warning is for project developers.  Use -Wno-dev to suppress it.

  CMake Warning (dev) at CMakeLists.txt:9 (install):
    Target ggml has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
  Call Stack (most recent call first):
    CMakeLists.txt:74 (llama_cpp_python_install_target)
  This warning is for project developers.  Use -Wno-dev to suppress it.

  CMake Warning (dev) at CMakeLists.txt:17 (install):
    Target ggml has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
  Call Stack (most recent call first):
    CMakeLists.txt:74 (llama_cpp_python_install_target)
  This warning is for project developers.  Use -Wno-dev to suppress it.

  -- Configuring done (13.0s)
  -- Generating done (0.2s)
  -- Build files have been written to: C:/Users/User/AppData/Local/Temp/tmpgt6us9y0/build
  *** Building project with Visual Studio 17 2022...
  Change Dir: 'C:/Users/User/AppData/Local/Temp/tmpgt6us9y0/build'

  Run Build Command(s): "C:/Program Files/Microsoft Visual Studio/2022/Community/MSBuild/Current/Bin/amd64/MSBuild.exe" ALL_BUILD.vcxproj /p:Configuration=Release /p:Platform=x64 /p:VisualStudioVersion=17.0 /v:n
  MSBuild version 17.10.4+10fbfbf2e for .NET Framework
  Build started 24/08/2024 21:00:06.

  Project "C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\ALL_BUILD.vcxproj" on node 1 (default targets).
  Project "C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\ZERO_CHECK.vcxproj" (2) on node 1 (default targets).
  PrepareForBuild:
    Creating directory "x64\Release\ZERO_CHECK\".
  C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\ZERO_CHECK.vcxproj]
    Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
    Creating directory "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\".
  InitializeBuildStatus:
    Creating "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
    Touching "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\unsuccessfulbuild".
  CustomBuild:
    1>Checking Build System
  FinalizeBuildStatus:
    Deleting file "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\unsuccessfulbuild".
    Touching "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\ZERO_CHECK.lastbuildstate".
  Done Building Project "C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\ZERO_CHECK.vcxproj" (default targets).
  Project "C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\vendor\llama.cpp\common\build_info.vcxproj" (3) on node 1 (default targets).
  PrepareForBuild:
    Creating directory "build_info.dir\Release\".
  C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\vendor\llama.cpp\common\build_info.vcxproj]
    Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
    Creating directory "build_info.dir\Release\build_info.tlog\".
  InitializeBuildStatus:
    Creating "build_info.dir\Release\build_info.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
    Touching "build_info.dir\Release\build_info.tlog\unsuccessfulbuild".
  CustomBuild:
    Generating build details from Git
    -- Found Git: C:/Program Files/Git/cmd/git.exe (found version "2.45.2.windows.1")
    Building Custom Rule D:/sherpa-talk/llama-cpp-python/vendor/llama.cpp/common/CMakeLists.txt
  ClCompile:
    C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.40.33807\bin\HostX64\x64\CL.exe /c /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D _CRT_SECURE_NO_WARNINGS /D "CMAKE_INTDIR=\"Release\"" /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /Fo"build_info.dir\Release\\" /Fd"build_info.dir\Release\build_info.pdb" /external:W1 /Gd /TP /errorReport:queue "D:\sherpa-talk\llama-cpp-python\vendor\llama.cpp\common\build-info.cpp"      
    build-info.cpp
  Lib:
    C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.40.33807\bin\HostX64\x64\Lib.exe /OUT:"build_info.dir\Release\build_info.lib" /NOLOGO /MACHINE:X64  /machine:x64 "build_info.dir\Release\build-info.obj"
    build_info.vcxproj -> C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\vendor\llama.cpp\common\build_info.dir\Release\build_info.lib
  FinalizeBuildStatus:
    Deleting file "build_info.dir\Release\build_info.tlog\unsuccessfulbuild".
    Touching "build_info.dir\Release\build_info.tlog\build_info.lastbuildstate".
  Done Building Project "C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\vendor\llama.cpp\common\build_info.vcxproj" (default targets).
  Project "C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (4) on node 1 (default targets).
  Project "C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (4) is building "C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (5) on node 1 (default targets).     
  Project "C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (5) is building "C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\vendor\llama.cpp\ggml\src\vulkan-shaders\vulkan-shaders-gen.vcxproj" (6) on node 1 (default targets).
  PrepareForBuild:
    Creating directory "vulkan-shaders-gen.dir\Release\".
  C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\vendor\llama.cpp\ggml\src\vulkan-shaders\vulkan-shaders-gen.vcxproj]
    Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
    Creating directory "C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\bin\Release\".
    Creating directory "vulkan-shaders-gen.dir\Release\vulkan-s.088A11F6.tlog\".
  InitializeBuildStatus:
    Creating "vulkan-shaders-gen.dir\Release\vulkan-s.088A11F6.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
    Touching "vulkan-shaders-gen.dir\Release\vulkan-s.088A11F6.tlog\unsuccessfulbuild".
  CustomBuild:
    Building Custom Rule D:/sherpa-talk/llama-cpp-python/vendor/llama.cpp/ggml/src/vulkan-shaders/CMakeLists.txt
  ClCompile:
    C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.40.33807\bin\HostX64\x64\CL.exe /c /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D _CRT_SECURE_NO_WARNINGS /D GGML_SCHED_MAX_COPIES=4 /D GGML_USE_OPENMP /D GGML_USE_LLAMAFILE /D "CMAKE_INTDIR=\"Release\"" /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /Fo"vulkan-shaders-gen.dir\Release\\" /Fd"vulkan-shaders-gen.dir\Release\vc143.pdb" /external:W1 /Gd /TP /errorReport:queue "D:\sherpa-talk\llama-cpp-python\vendor\llama.cpp\ggml\src\vulkan-shaders\vulkan-shaders-gen.cpp"
    vulkan-shaders-gen.cpp
  MakeDirsForLink:
    Creating directory "C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\vendor\llama.cpp\ggml\src\vulkan-shaders\Release\".
  Link:
    C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.40.33807\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\bin\Release\vulkan-shaders-gen.exe" /INCREMENTAL:NO /NOLOGO kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib /MANIFEST /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /manifest:embed /PDB:"C:/Users/User/AppData/Local/Temp/tmpgt6us9y0/build/bin/Release/vulkan-shaders-gen.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /DYNAMICBASE /NXCOMPAT /IMPLIB:"C:/Users/User/AppData/Local/Temp/tmpgt6us9y0/build/vendor/llama.cpp/ggml/src/vulkan-shaders/Release/vulkan-shaders-gen.lib" /MACHINE:X64  /machine:x64 "vulkan-shaders-gen.dir\Release\vulkan-shaders-gen.obj"
    vulkan-shaders-gen.vcxproj -> C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\bin\Release\vulkan-shaders-gen.exe
  FinalizeBuildStatus:
    Deleting file "vulkan-shaders-gen.dir\Release\vulkan-s.088A11F6.tlog\unsuccessfulbuild".
    Touching "vulkan-shaders-gen.dir\Release\vulkan-s.088A11F6.tlog\vulkan-shaders-gen.lastbuildstate".
  Done Building Project "C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\vendor\llama.cpp\ggml\src\vulkan-shaders\vulkan-shaders-gen.vcxproj" (default targets).
  C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\vendor\llama.cpp\ggml\src\ggml.vcxproj]
    Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
    Creating directory "ggml.dir\Release\ggml.tlog\".
  InitializeBuildStatus:
    Creating "ggml.dir\Release\ggml.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
    Touching "ggml.dir\Release\ggml.tlog\unsuccessfulbuild".
  CustomBuild:
    Generate vulkan shaders
    ggml_vulkan: Generating and compiling shaders to SPIR-V
    Building Custom Rule D:/sherpa-talk/llama-cpp-python/vendor/llama.cpp/ggml/src/CMakeLists.txt
  ClCompile:
    C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.40.33807\bin\HostX64\x64\CL.exe /c /I"D:\sherpa-talk\llama-cpp-python\vendor\llama.cpp\ggml\src\..\include" /I"D:\sherpa-talk\llama-cpp-python\vendor\llama.cpp\ggml\src\." /IC:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\vendor\llama.cpp\ggml\src /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D GGML_USE_VULKAN /D GGML_SHARED /D GGML_BUILD /D _CRT_SECURE_NO_WARNINGS /D GGML_SCHED_MAX_COPIES=4 /D GGML_USE_OPENMP /D GGML_USE_LLAMAFILE /D _XOPEN_SOURCE=600 /D "CMAKE_INTDIR=\"Release\"" /D ggml_EXPORTS /EHsc /MD /GS /arch:AVX2 /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /openmp /std:c11 /Fo"ggml.dir\Release\\" /Fd"ggml.dir\Release\vc143.pdb" /external:W0 /Gd /TC /errorReport:queue  /external:I "C:/VulkanSDK/1.3.290.0/Include" "D:\sherpa-talk\llama-cpp-python\vendor\llama.cpp\ggml\src\ggml.c" "D:\sherpa-talk\llama-cpp-python\vendor\llama.cpp\ggml\src\ggml-alloc.c" "D:\sherpa-talk\llama-cpp-python\vendor\llama.cpp\ggml\src\ggml-backend.c" "D:\sherpa-talk\llama-cpp-python\vendor\llama.cpp\ggml\src\ggml-quants.c" "D:\sherpa-talk\llama-cpp-python\vendor\llama.cpp\ggml\src\ggml-aarch64.c"
    ggml.c
    ggml-alloc.c
    ggml-backend.c
    ggml-quants.c
    ggml-aarch64.c
    Generating Code...
    C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.40.33807\bin\HostX64\x64\CL.exe /c /I"D:\sherpa-talk\llama-cpp-python\vendor\llama.cpp\ggml\src\..\include" /I"D:\sherpa-talk\llama-cpp-python\vendor\llama.cpp\ggml\src\." /IC:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\vendor\llama.cpp\ggml\src /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D GGML_USE_VULKAN /D GGML_SHARED /D GGML_BUILD /D _CRT_SECURE_NO_WARNINGS /D GGML_SCHED_MAX_COPIES=4 /D GGML_USE_OPENMP /D GGML_USE_LLAMAFILE /D _XOPEN_SOURCE=600 /D "CMAKE_INTDIR=\"Release\"" /D ggml_EXPORTS /EHsc /MD /GS /arch:AVX2 /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /openmp /Fo"ggml.dir\Release\\" /Fd"ggml.dir\Release\vc143.pdb" /external:W0 /Gd /TP /errorReport:queue  /external:I "C:/VulkanSDK/1.3.290.0/Include" "D:\sherpa-talk\llama-cpp-python\vendor\llama.cpp\ggml\src\ggml-vulkan.cpp" "C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\vendor\llama.cpp\ggml\src\ggml-vulkan-shaders.cpp" "D:\sherpa-talk\llama-cpp-python\vendor\llama.cpp\ggml\src\llamafile\sgemm.cpp"
    ggml-vulkan.cpp
  D:\sherpa-talk\llama-cpp-python\vendor\llama.cpp\ggml\src\ggml-vulkan.cpp(2056,17): warning C4297: 'ggml_vk_instance_init': function assumed not to throw an exception but does [C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\vendor\llama.cpp\ggml\src\ggml.vcxproj]
        D:\sherpa-talk\llama-cpp-python\vendor\llama.cpp\ggml\src\ggml-vulkan.cpp(2056,17):
        __declspec(nothrow), throw(), noexcept(true), or noexcept was specified on the function

  D:\sherpa-talk\llama-cpp-python\vendor\llama.cpp\ggml\src\ggml-vulkan.cpp(6710,38): warning C4477: 'snprintf' : format string '%ld' requires an argument of type 'long', but variadic argument 2 has type 'size_t' [C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\vendor\llama.cpp\ggml\src\ggml.vcxproj]
        D:\sherpa-talk\llama-cpp-python\vendor\llama.cpp\ggml\src\ggml-vulkan.cpp(6710,38):
        consider using '%zd' in the format string

    ggml-vulkan-shaders.cpp
    sgemm.cpp
    Generating Code...
  MakeDirsForLink:
    Creating directory "C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\vendor\llama.cpp\ggml\src\Release\".
  PreLinkEvent:
    Auto build dll exports
    setlocal
    cd C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\vendor\llama.cpp\ggml\src
    if %errorlevel% neq 0 goto :cmEnd
    C:
    if %errorlevel% neq 0 goto :cmEnd
    "C:\Program Files\CMake\bin\cmake.exe" -E __create_def C:/Users/User/AppData/Local/Temp/tmpgt6us9y0/build/vendor/llama.cpp/ggml/src/ggml.dir/Release/exports.def C:/Users/User/AppData/Local/Temp/tmpgt6us9y0/build/vendor/llama.cpp/ggml/src/ggml.dir/Release//objects.txt
    if %errorlevel% neq 0 goto :cmEnd
    :cmEnd
    endlocal & call :cmErrorLevel %errorlevel% & goto :cmDone
    :cmErrorLevel
    exit /b %1
    :cmDone
    if %errorlevel% neq 0 goto :VCEnd
    :VCEnd
  Link:
    C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.40.33807\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\bin\Release\ggml.dll" /INCREMENTAL:NO /NOLOGO "C:\VulkanSDK\1.3.290.0\Lib\vulkan-1.lib" kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib /DEF:"C:/Users/User/AppData/Local/Temp/tmpgt6us9y0/build/vendor/llama.cpp/ggml/src/ggml.dir/Release/exports.def" /MANIFEST /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /manifest:embed /PDB:"C:/Users/User/AppData/Local/Temp/tmpgt6us9y0/build/bin/Release/ggml.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /DYNAMICBASE /NXCOMPAT /IMPLIB:"C:/Users/User/AppData/Local/Temp/tmpgt6us9y0/build/vendor/llama.cpp/ggml/src/Release/ggml.lib" /MACHINE:X64  /machine:x64 /DLL ggml.dir\Release\ggml.obj
    "ggml.dir\Release\ggml-alloc.obj"
    "ggml.dir\Release\ggml-backend.obj"
    "ggml.dir\Release\ggml-quants.obj"
    "ggml.dir\Release\ggml-vulkan.obj"
    "ggml.dir\Release\ggml-vulkan-shaders.obj"
    ggml.dir\Release\sgemm.obj
    "ggml.dir\Release\ggml-aarch64.obj"
       Creating library C:/Users/User/AppData/Local/Temp/tmpgt6us9y0/build/vendor/llama.cpp/ggml/src/Release/ggml.lib and object C:/Users/User/AppData/Local/Temp/tmpgt6us9y0/build/vendor/llama.cpp/ggml/src/Release/ggml.exp
    ggml.vcxproj -> C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\bin\Release\ggml.dll
  FinalizeBuildStatus:
    Deleting file "ggml.dir\Release\ggml.tlog\unsuccessfulbuild".
    Touching "ggml.dir\Release\ggml.tlog\ggml.lastbuildstate".
  Done Building Project "C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (default targets).
  Project "C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (4) is building "C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\vendor\llama.cpp\src\llama.vcxproj" (7) on node 1 (default targets).
  C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpgt6us9y0\build\vendor\llama.cpp\src\llama.vcxproj]
    Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
    Creating directory "llama.dir\Release\llama.tlog\".
  InitializeBuildStatus:
    Creating "llama.dir\Release\llama.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
    Touching "llama.dir\Release\llama.tlog\unsuccessfulbuild".
  CustomBuild:
    Building Custom Rule D:/sherpa-talk/llama-cpp-python/vendor/llama.cpp/src/CMakeLists.txt
  ClCompile:
    C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.40.33807\bin\HostX64\x64\CL.exe /c /I"D:\sherpa-talk\llama-cpp-python\vendor\llama.cpp\src\." /I"D:\sherpa-talk\llama-cpp-python\vendor\llama.cpp\src\..\include" /I"D:\sherpa-talk\llama-cpp-python\vendor\llama.cpp\ggml\src\..\include" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D LLAMA_SHARED /D LLAMA_BUILD /D _CRT_SECURE_NO_WARNINGS /D GGML_USE_VULKAN /D "CMAKE_INTDIR=\"Release\"" /D llama_EXPORTS /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /Fo"llama.dir\Release\\" /Fd"llama.dir\Release\vc143.pdb" /external:W1 /Gd /TP /errorReport:queue "D:\sherpa-talk\llama-cpp-python\vendor\llama.cpp\src\llama.cpp" "D:\sherpa-talk\llama-cpp-python\vendor\llama.cpp\src\llama-vocab.cpp" "D:\sherpa-talk\llama-cpp-python\vendor\llama.cpp\src\llama-grammar.cpp" "D:\sherpa-talk\llama-cpp-python\vendor\llama.cpp\src\llama-sampling.cpp" "D:\sherpa-talk\llama-cpp-python\vendor\llama.cpp\src\unicode.cpp" "D:\sherpa-talk\llama-cpp-python\vendor\llama.cpp\src\unicode-data.cpp"
    llama.cpp
    llama-vocab.cpp
    llama-grammar.cpp
    llama-sampling.cpp
    unicode.cpp
    unicode-data.cpp
    Generating Code...

Steps to Reproduce

$env:VULKAN_SDK = "C:\VulkanSDK\1.3.290.0"
$env:CMAKE_ARGS="-DGGML_VULKAN=ON" 
git clone https://github.com/abetlen/llama-cpp-python
pip install .\llama-cpp-python

https://github.com/ggerganov/llama.cpp/tree/1731d4238f9e4f925a750810e7f5480827c66dcf

With the original llama.cpp (linked above), the Vulkan build completes quickly and without errors.
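
For reference, the upstream comparison build can be done roughly like this (a sketch; the Vulkan flag is assumed to match the reproduction above):

cd llama.cpp
cmake -B build -DGGML_VULKAN=ON
cmake --build build --config Release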

If more information is needed, let me know.
By the way, I think the issue template should be improved; it asks for too many things and isn't streamlined.

This also happens without Vulkan.

@Shyryp commented Aug 25, 2024

I have the same problem, but during the upgrade:
.\python.exe -m pip install --upgrade --force-reinstall --no-cache-dir --verbose llama-cpp-python

Previously (two months ago), the installation worked without problems.

@pavanrang commented

You can try installing a pre-built wheel with basic CPU support:

pip install llama-cpp-python --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cpu

@thewh1teagle (Author) commented

It's stuck exactly here:

 with incremental build. [C:\Users\User\AppData\Local\Temp\tmpij2flo7g\build\vendor\llama.cpp\src\llama.vcxproj]
    Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
    Creating directory "llama.dir\Release\llama.tlog\".
  InitializeBuildStatus:
    Creating "llama.dir\Release\llama.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
    Touching "llama.dir\Release\llama.tlog\unsuccessfulbuild".
  CustomBuild:
    Building Custom Rule C:/Users/User/AppData/Local/Temp/pip-req-build-xtbmoz6p/vendor/llama.cpp/src/CMakeLists.txt
  ClCompile:
    C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.40.33807\bin\HostX64\x64\CL.exe /c /I"C:\Users\User\AppData\Local\Temp\pip-req-build-xtbmoz6p\vendor\llama.cpp\src\." /I"C:\Users\User\AppData\Local\Temp\pip-req-build-xtbmoz6p\vendor\llama.cpp\src\..\include" /I"C:\Users\User\AppData\Local\Temp\pip-req-build-xtbmoz6p\vendor\llama.cpp\ggml\src\..\include" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D LLAMA_SHARED /D LLAMA_BUILD /D _CRT_SECURE_NO_WARNINGS /D GGML_USE_VULKAN /D "CMAKE_INTDIR=\"Release\"" /D llama_EXPORTS /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /Fo"llama.dir\Release\\" /Fd"llama.dir\Release\vc143.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\User\AppData\Local\Temp\pip-req-build-xtbmoz6p\vendor\llama.cpp\src\llama.cpp" "C:\Users\User\AppData\Local\Temp\pip-req-build-xtbmoz6p\vendor\llama.cpp\src\llama-vocab.cpp" "C:\Users\User\AppData\Local\Temp\pip-req-build-xtbmoz6p\vendor\llama.cpp\src\llama-grammar.cpp" "C:\Users\User\AppData\Local\Temp\pip-req-build-xtbmoz6p\vendor\llama.cpp\src\llama-sampling.cpp" "C:\Users\User\AppData\Local\Temp\pip-req-build-xtbmoz6p\vendor\llama.cpp\src\unicode.cpp" "C:\Users\User\AppData\Local\Temp\pip-req-build-xtbmoz6p\vendor\llama.cpp\src\unicode-data.cpp"
    llama.cpp
    llama-vocab.cpp
    llama-grammar.cpp
    llama-sampling.cpp
    unicode.cpp
    unicode-data.cpp
    Generating Code...

@Shyryp commented Aug 29, 2024

(Quoting @thewh1teagle's "It's stuck exactly here" build log above.)

@thewh1teagle Just wait. It can take from 10 minutes to an hour (depending on your PC configuration).

@thewh1teagle (Author) commented

@Shyryp llama.cpp builds in about 1 minute. Do you know why it should take 10 minutes when built through Python?

@Shyryp commented Aug 29, 2024

@thewh1teagle Sorry, I'm not aware of why this might be happening. I haven't looked into it in depth.
Hopefully the creators/developers of llama-cpp-python will look into this issue or explain what's going on.

@abetlen (Owner) commented Aug 29, 2024

@thewh1teagle can you try rebuilding with --verbose to get an idea of what's being compiled?

Additionally, when building llama.cpp, can you post your full logs and time to build (from a clean repo)?
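
A minimal PowerShell sketch for capturing the requested logs and timing (the log file name is illustrative; run the cmake part inside a clean llama.cpp checkout):

pip install .\llama-cpp-python --verbose 2>&1 | Tee-Object build.log
Measure-Command { cmake -B build .; cmake --build build --config Release }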

@thewh1teagle (Author) commented Aug 29, 2024

It turns out this happens with both llama-cpp-python and llama.cpp in Release mode; I thought it didn't happen with llama.cpp because I had compiled it in the default mode.

I'm not sure why it compiles quickly in Debug mode but gets stuck seemingly forever in Release mode:

ggerganov/llama.cpp#9242

@abetlen (Owner) commented Aug 29, 2024

Linking #1714 as it seems to be the same issue. It appears to be aggressive link-time optimization by MSVC.
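
One hedged way to test that hypothesis (a suggestion, not verified in this thread) is to configure with CMake's interprocedural-optimization switch forced off, which maps to MSVC's /GL and /LTCG:

cmake -B build . -DCMAKE_BUILD_TYPE=Release -DCMAKE_INTERPROCEDURAL_OPTIMIZATION=OFF
cmake --build build --config Release --target llama-cli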

@thewh1teagle (Author) commented Aug 29, 2024

How can I disable it in Release mode (via the CLI) to verify that this is the cause?

Update

Using the following:

cmake -B build . -DCMAKE_BUILD_TYPE=Release -DCMAKE_CXX_FLAGS="/Od"
cmake --build build --config Release --target llama-cli

fixed the issue.
It compiled in 116 seconds.
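
The same flag can presumably be forwarded to the pip build through CMAKE_ARGS (an untested sketch; llama-cpp-python reads CMAKE_ARGS at build time):

$env:CMAKE_ARGS = '-DCMAKE_CXX_FLAGS="/Od"'
pip install .\llama-cpp-python --verbose

Note that /Od disables optimization entirely, so the resulting binaries will run slower; this mainly confirms that the optimizer is what stalls the build.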

@chozillla commented

Can you provide the equivalent command for the Python CLI? I'm also having issues with the build taking a very long time:

set CMAKE_ARGS=-DLLAMA_CUBLAS=on

pip install llama-cpp-python --verbose
Using pip 24.2 from C:\ProgramData\miniconda3\Lib\site-packages\pip (python 3.12)
Defaulting to user installation because normal site-packages is not writeable
Collecting llama-cpp-python
Using cached llama_cpp_python-0.3.1.tar.gz (63.9 MB)
Running command pip subprocess to install build dependencies
Using pip 24.2 from C:\ProgramData\miniconda3\Lib\site-packages\pip (python 3.12)
Collecting scikit-build-core>=0.9.2 (from scikit-build-core[pyproject]>=0.9.2)
Obtaining dependency information for scikit-build-core>=0.9.2 from https://files.pythonhosted.org/packages/88/fe/90476c4f6a1b2f922efa00d26e876dd40c7279e28ec18f08f0851ad21ba6/scikit_build_core-0.10.7-py3-none-any.whl.metadata
Using cached scikit_build_core-0.10.7-py3-none-any.whl.metadata (21 kB)
Collecting packaging>=21.3 (from scikit-build-core>=0.9.2->scikit-build-core[pyproject]>=0.9.2)
Obtaining dependency information for packaging>=21.3 from https://files.pythonhosted.org/packages/08/aa/cc0199a5f0ad350994d660967a8efb233fe0416e4639146c089643407ce6/packaging-24.1-py3-none-any.whl.metadata
Using cached packaging-24.1-py3-none-any.whl.metadata (3.2 kB)
Collecting pathspec>=0.10.1 (from scikit-build-core>=0.9.2->scikit-build-core[pyproject]>=0.9.2)
Obtaining dependency information for pathspec>=0.10.1 from https://files.pythonhosted.org/packages/cc/20/ff623b09d963f88bfde16306a54e12ee5ea43e9b597108672ff3a408aad6/pathspec-0.12.1-py3-none-any.whl.metadata
Using cached pathspec-0.12.1-py3-none-any.whl.metadata (21 kB)
Using cached scikit_build_core-0.10.7-py3-none-any.whl (165 kB)
Using cached packaging-24.1-py3-none-any.whl (53 kB)
Using cached pathspec-0.12.1-py3-none-any.whl (31 kB)
Installing collected packages: pathspec, packaging, scikit-build-core
Successfully installed packaging-24.1 pathspec-0.12.1 scikit-build-core-0.10.7
Installing build dependencies ... done
Running command Getting requirements to build wheel
Getting requirements to build wheel ... done
Running command pip subprocess to install backend dependencies
Using pip 24.2 from C:\ProgramData\miniconda3\Lib\site-packages\pip (python 3.12)
Collecting cmake>=3.21
Obtaining dependency information for cmake>=3.21 from https://files.pythonhosted.org/packages/74/ed/6624adba772329c9c8a9765ee2ee06df7082ad7ac75cf33200d405c4c45f/cmake-3.30.5-py3-none-win_amd64.whl.metadata
Using cached cmake-3.30.5-py3-none-win_amd64.whl.metadata (6.4 kB)
Using cached cmake-3.30.5-py3-none-win_amd64.whl (35.6 MB)
Installing collected packages: cmake
Creating C:\Users\poko-\AppData\Local\Temp\pip-build-env-3br2ukd7\normal\Scripts
Successfully installed cmake-3.30.5
Installing backend dependencies ... done
Running command Preparing metadata (pyproject.toml)
*** scikit-build-core 0.10.7 using CMake 3.30.5 (metadata_wheel)
Preparing metadata (pyproject.toml) ... done
Requirement already satisfied: typing-extensions>=4.5.0 in c:\users\poko-\appdata\roaming\python\python312\site-packages (from llama-cpp-python) (4.12.2)
Requirement already satisfied: numpy>=1.20.0 in c:\users\poko-\appdata\roaming\python\python312\site-packages (from llama-cpp-python) (1.26.4)
Collecting diskcache>=5.6.1 (from llama-cpp-python)
Obtaining dependency information for diskcache>=5.6.1 from https://files.pythonhosted.org/packages/3f/27/4570e78fc0bf5ea0ca45eb1de3818a23787af9b390c0b0a0033a1b8236f9/diskcache-5.6.3-py3-none-any.whl.metadata
Using cached diskcache-5.6.3-py3-none-any.whl.metadata (20 kB)
Requirement already satisfied: jinja2>=2.11.3 in c:\users\poko-\appdata\roaming\python\python312\site-packages (from llama-cpp-python) (3.1.4)
Requirement already satisfied: MarkupSafe>=2.0 in c:\users\poko-\appdata\roaming\python\python312\site-packages (from jinja2>=2.11.3->llama-cpp-python) (3.0.1)
Using cached diskcache-5.6.3-py3-none-any.whl (45 kB)
Building wheels for collected packages: llama-cpp-python
Running command Building wheel for llama-cpp-python (pyproject.toml)
*** scikit-build-core 0.10.7 using CMake 3.30.5 (wheel)
*** Configuring CMake...
2024-10-23 11:00:16,530 - scikit_build_core - WARNING - Can't find a Python library, got libdir=None, ldlibrary=None, multiarch=None, masd=None
loading initial cache file C:\Users\poko-\AppData\Local\Temp\tmpj2ijvozw\build\CMakeInit.txt
-- Building for: Visual Studio 17 2022
-- Selecting Windows SDK version 10.0.22621.0 to target Windows 10.0.22631.
-- The C compiler identification is MSVC 19.41.34123.0
-- The CXX compiler identification is MSVC 19.41.34123.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: C:/Program Files (x86)/Microsoft Visual Studio/2022/BuildTools/VC/Tools/MSVC/14.41.34120/bin/Hostx64/x64/cl.exe - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: C:/Program Files (x86)/Microsoft Visual Studio/2022/BuildTools/VC/Tools/MSVC/14.41.34120/bin/Hostx64/x64/cl.exe - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found Git: C:/Program Files/Git/cmd/git.exe (found version "2.47.0.windows.1")
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
-- Looking for pthread_create in pthreads
-- Looking for pthread_create in pthreads - not found
-- Looking for pthread_create in pthread
-- Looking for pthread_create in pthread - not found
-- Found Threads: TRUE
-- Found OpenMP_C: -openmp (found version "2.0")
-- Found OpenMP_CXX: -openmp (found version "2.0")
-- Found OpenMP: TRUE (found version "2.0")
-- OpenMP found
-- Using llamafile
-- Warning: ccache not found - consider installing it for faster compilation or disable this warning with GGML_CCACHE=OFF
-- CMAKE_SYSTEM_PROCESSOR: AMD64
-- CMAKE_GENERATOR_PLATFORM: x64
-- x86 detected
-- Performing Test HAS_AVX_1
-- Performing Test HAS_AVX_1 - Success
-- Performing Test HAS_AVX2_1
-- Performing Test HAS_AVX2_1 - Success
-- Performing Test HAS_FMA_1
-- Performing Test HAS_FMA_1 - Success
-- Performing Test HAS_AVX512_1
-- Performing Test HAS_AVX512_1 - Success
CMake Warning (dev) at CMakeLists.txt:9 (install):
Target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
Call Stack (most recent call first):
CMakeLists.txt:73 (llama_cpp_python_install_target)
This warning is for project developers. Use -Wno-dev to suppress it.

CMake Warning (dev) at CMakeLists.txt:17 (install):
Target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
Call Stack (most recent call first):
CMakeLists.txt:73 (llama_cpp_python_install_target)
This warning is for project developers. Use -Wno-dev to suppress it.

CMake Warning (dev) at CMakeLists.txt:9 (install):
Target ggml has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
Call Stack (most recent call first):
CMakeLists.txt:74 (llama_cpp_python_install_target)
This warning is for project developers. Use -Wno-dev to suppress it.

CMake Warning (dev) at CMakeLists.txt:17 (install):
Target ggml has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
Call Stack (most recent call first):
CMakeLists.txt:74 (llama_cpp_python_install_target)
This warning is for project developers. Use -Wno-dev to suppress it.

-- Configuring done (12.4s)
-- Generating done (0.1s)
-- Build files have been written to: C:/Users/poko-/AppData/Local/Temp/tmpj2ijvozw/build
*** Building project with Visual Studio 17 2022...
Change Dir: 'C:/Users/poko-/AppData/Local/Temp/tmpj2ijvozw/build'

Run Build Command(s): "C:/Program Files (x86)/Microsoft Visual Studio/2022/BuildTools/MSBuild/Current/Bin/amd64/MSBuild.exe" ALL_BUILD.vcxproj /p:Configuration=Release /p:Platform=x64 /p:VisualStudioVersion=17.0 /v:n
MSBuild version 17.11.9+a69bbaaf5 for .NET Framework
Build started 23/10/2024 11.00.29.

Project "C:\Users\poko-\AppData\Local\Temp\tmpj2ijvozw\build\ALL_BUILD.vcxproj" on node 1 (default targets).
Project "C:\Users\poko-\AppData\Local\Temp\tmpj2ijvozw\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\poko-\AppData\Local\Temp\tmpj2ijvozw\build\ZERO_CHECK.vcxproj" (2) on node 1 (default targets).
PrepareForBuild:
Creating directory "x64\Release\ZERO_CHECK".
C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\poko-\AppData\Local\Temp\tmpj2ijvozw\build\ZERO_CHECK.vcxproj]
Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
Creating directory "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog".
InitializeBuildStatus:
Creating "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
Touching "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\unsuccessfulbuild".
CustomBuild:
1>Checking Build System
FinalizeBuildStatus:
Deleting file "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\unsuccessfulbuild".
Touching "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\ZERO_CHECK.lastbuildstate".
Done Building Project "C:\Users\poko-\AppData\Local\Temp\tmpj2ijvozw\build\ZERO_CHECK.vcxproj" (default targets).
Project "C:\Users\poko-\AppData\Local\Temp\tmpj2ijvozw\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\poko-\AppData\Local\Temp\tmpj2ijvozw\build\vendor\llama.cpp\common\build_info.vcxproj" (3) on node 1 (default targets).
PrepareForBuild:
Creating directory "build_info.dir\Release".
C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\poko-\AppData\Local\Temp\tmpj2ijvozw\build\vendor\llama.cpp\common\build_info.vcxproj]
Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
Creating directory "build_info.dir\Release\build_info.tlog".
InitializeBuildStatus:
Creating "build_info.dir\Release\build_info.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
Touching "build_info.dir\Release\build_info.tlog\unsuccessfulbuild".
CustomBuild:
Generating build details from Git
-- Found Git: C:/Program Files/Git/cmd/git.exe (found version "2.47.0.windows.1")
Building Custom Rule C:/Users/poko-/AppData/Local/Temp/pip-install-i0qazzsi/llama-cpp-python_41fdea811ec3461c9539d88b683664a6/vendor/llama.cpp/common/CMakeLists.txt
ClCompile:
C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\bin\HostX64\x64\CL.exe /c /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D _CRT_SECURE_NO_WARNINGS /D "CMAKE_INTDIR="Release"" /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /Fo"build_info.dir\Release\" /Fd"build_info.dir\Release\build_info.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\poko-\AppData\Local\Temp\pip-install-i0qazzsi\llama-cpp-python_41fdea811ec3461c9539d88b683664a6\vendor\llama.cpp\common\build-info.cpp"
build-info.cpp
Lib:
C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\bin\HostX64\x64\Lib.exe /OUT:"build_info.dir\Release\build_info.lib" /NOLOGO /MACHINE:X64 /machine:x64 "build_info.dir\Release\build-info.obj"
build_info.vcxproj -> C:\Users\poko-\AppData\Local\Temp\tmpj2ijvozw\build\vendor\llama.cpp\common\build_info.dir\Release\build_info.lib
FinalizeBuildStatus:
Deleting file "build_info.dir\Release\build_info.tlog\unsuccessfulbuild".
Touching "build_info.dir\Release\build_info.tlog\build_info.lastbuildstate".
Done Building Project "C:\Users\poko-\AppData\Local\Temp\tmpj2ijvozw\build\vendor\llama.cpp\common\build_info.vcxproj" (default targets).
Project "C:\Users\poko-\AppData\Local\Temp\tmpj2ijvozw\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\poko-\AppData\Local\Temp\tmpj2ijvozw\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (4) on node 1 (default targets).
Project "C:\Users\poko-\AppData\Local\Temp\tmpj2ijvozw\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (4) is building "C:\Users\poko-\AppData\Local\Temp\tmpj2ijvozw\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (5) on node 1 (default targets).
C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\poko-\AppData\Local\Temp\tmpj2ijvozw\build\vendor\llama.cpp\ggml\src\ggml.vcxproj]
Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
Creating directory "C:\Users\poko-\AppData\Local\Temp\tmpj2ijvozw\build\bin\Release".
Creating directory "ggml.dir\Release\ggml.tlog".
InitializeBuildStatus:
Creating "ggml.dir\Release\ggml.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
Touching "ggml.dir\Release\ggml.tlog\unsuccessfulbuild".
CustomBuild:
Building Custom Rule C:/Users/poko-/AppData/Local/Temp/pip-install-i0qazzsi/llama-cpp-python_41fdea811ec3461c9539d88b683664a6/vendor/llama.cpp/ggml/src/CMakeLists.txt
ClCompile:
C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\bin\HostX64\x64\CL.exe /c /I"C:\Users\poko-\AppData\Local\Temp\pip-install-i0qazzsi\llama-cpp-python_41fdea811ec3461c9539d88b683664a6\vendor\llama.cpp\ggml\src..\include" /I"C:\Users\poko-\AppData\Local\Temp\pip-install-i0qazzsi\llama-cpp-python_41fdea811ec3461c9539d88b683664a6\vendor\llama.cpp\ggml\src." /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D GGML_SHARED /D GGML_BUILD /D _CRT_SECURE_NO_WARNINGS /D GGML_SCHED_MAX_COPIES=4 /D GGML_USE_OPENMP /D GGML_USE_LLAMAFILE /D _XOPEN_SOURCE=600 /D "CMAKE_INTDIR="Release"" /D ggml_EXPORTS /EHsc /MD /GS /arch:AVX512 /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /openmp /std:c11 /Fo"ggml.dir\Release\" /Fd"ggml.dir\Release\vc143.pdb" /external:W1 /Gd /TC /errorReport:queue "C:\Users\poko-\AppData\Local\Temp\pip-install-i0qazzsi\llama-cpp-python_41fdea811ec3461c9539d88b683664a6\vendor\llama.cpp\ggml\src\ggml.c" "C:\Users\poko-\AppData\Local\Temp\pip-install-i0qazzsi\llama-cpp-python_41fdea811ec3461c9539d88b683664a6\vendor\llama.cpp\ggml\src\ggml-alloc.c" "C:\Users\poko-\AppData\Local\Temp\pip-install-i0qazzsi\llama-cpp-python_41fdea811ec3461c9539d88b683664a6\vendor\llama.cpp\ggml\src\ggml-backend.c" "C:\Users\poko-\AppData\Local\Temp\pip-install-i0qazzsi\llama-cpp-python_41fdea811ec3461c9539d88b683664a6\vendor\llama.cpp\ggml\src\ggml-quants.c" "C:\Users\poko-\AppData\Local\Temp\pip-install-i0qazzsi\llama-cpp-python_41fdea811ec3461c9539d88b683664a6\vendor\llama.cpp\ggml\src\ggml-aarch64.c"
ggml.c
ggml-alloc.c
ggml-backend.c
ggml-quants.c
C:\Program Files (x86)\Windows Kits\10\Include\10.0.22621.0\ucrt\assert.h(21,9): warning C4005: 'static_assert': macro redefinition [C:\Users\poko-\AppData\Local\Temp\tmpj2ijvozw\build\vendor\llama.cpp\ggml\src\ggml.vcxproj]
(compiling source file '../../../../../../pip-install-i0qazzsi/llama-cpp-python_41fdea811ec3461c9539d88b683664a6/vendor/llama.cpp/ggml/src/ggml-quants.c')
C:\Users\poko-\AppData\Local\Temp\pip-install-i0qazzsi\llama-cpp-python_41fdea811ec3461c9539d88b683664a6\vendor\llama.cpp\ggml\src\ggml-common.h(62,9):
see previous definition of 'static_assert'

ggml-aarch64.c

C:\Program Files (x86)\Windows Kits\10\Include\10.0.22621.0\ucrt\assert.h(21,9): warning C4005: 'static_assert': macro redefinition [C:\Users\poko-\AppData\Local\Temp\tmpj2ijvozw\build\vendor\llama.cpp\ggml\src\ggml.vcxproj]
(compiling source file '../../../../../../pip-install-i0qazzsi/llama-cpp-python_41fdea811ec3461c9539d88b683664a6/vendor/llama.cpp/ggml/src/ggml-aarch64.c')
C:\Users\poko-\AppData\Local\Temp\pip-install-i0qazzsi\llama-cpp-python_41fdea811ec3461c9539d88b683664a6\vendor\llama.cpp\ggml\src\ggml-common.h(62,9):
see previous definition of 'static_assert'

Generating Code...
C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\bin\HostX64\x64\CL.exe /c /I"C:\Users\poko-\AppData\Local\Temp\pip-install-i0qazzsi\llama-cpp-python_41fdea811ec3461c9539d88b683664a6\vendor\llama.cpp\ggml\src\..\include" /I"C:\Users\poko-\AppData\Local\Temp\pip-install-i0qazzsi\llama-cpp-python_41fdea811ec3461c9539d88b683664a6\vendor\llama.cpp\ggml\src\." /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D GGML_SHARED /D GGML_BUILD /D _CRT_SECURE_NO_WARNINGS /D GGML_SCHED_MAX_COPIES=4 /D GGML_USE_OPENMP /D GGML_USE_LLAMAFILE /D _XOPEN_SOURCE=600 /D "CMAKE_INTDIR=\"Release\"" /D ggml_EXPORTS /EHsc /MD /GS /arch:AVX512 /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /openmp /Fo"ggml.dir\Release\\" /Fd"ggml.dir\Release\vc143.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\poko-\AppData\Local\Temp\pip-install-i0qazzsi\llama-cpp-python_41fdea811ec3461c9539d88b683664a6\vendor\llama.cpp\ggml\src\llamafile\sgemm.cpp"
sgemm.cpp

MakeDirsForLink:
Creating directory "C:\Users\poko-\AppData\Local\Temp\tmpj2ijvozw\build\vendor\llama.cpp\ggml\src\Release".
PreLinkEvent:
Auto build dll exports
setlocal
cd C:\Users\poko-\AppData\Local\Temp\tmpj2ijvozw\build\vendor\llama.cpp\ggml\src
if %errorlevel% neq 0 goto :cmEnd
C:
if %errorlevel% neq 0 goto :cmEnd
C:\Users\poko-\AppData\Local\Temp\pip-build-env-3br2ukd7\normal\Lib\site-packages\cmake\data\bin\cmake.exe -E __create_def C:/Users/poko-/AppData/Local/Temp/tmpj2ijvozw/build/vendor/llama.cpp/ggml/src/ggml.dir/Release/exports.def C:/Users/poko-/AppData/Local/Temp/tmpj2ijvozw/build/vendor/llama.cpp/ggml/src/ggml.dir/Release//objects.txt
if %errorlevel% neq 0 goto :cmEnd
:cmEnd
endlocal & call :cmErrorLevel %errorlevel% & goto :cmDone
:cmErrorLevel
exit /b %1
:cmDone
if %errorlevel% neq 0 goto :VCEnd
:VCEnd
Link:
C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\poko-\AppData\Local\Temp\tmpj2ijvozw\build\bin\Release\ggml.dll" /INCREMENTAL:NO /NOLOGO kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib /DEF:"C:/Users/poko-/AppData/Local/Temp/tmpj2ijvozw/build/vendor/llama.cpp/ggml/src/ggml.dir/Release/exports.def" /MANIFEST /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /manifest:embed /PDB:"C:/Users/poko-/AppData/Local/Temp/tmpj2ijvozw/build/bin/Release/ggml.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /DYNAMICBASE /NXCOMPAT /IMPLIB:"C:/Users/poko-/AppData/Local/Temp/tmpj2ijvozw/build/vendor/llama.cpp/ggml/src/Release/ggml.lib" /MACHINE:X64 /machine:x64 /DLL ggml.dir\Release\ggml.obj
"ggml.dir\Release\ggml-alloc.obj"
"ggml.dir\Release\ggml-backend.obj"
"ggml.dir\Release\ggml-quants.obj"
ggml.dir\Release\sgemm.obj
"ggml.dir\Release\ggml-aarch64.obj"
Creating library C:/Users/poko-/AppData/Local/Temp/tmpj2ijvozw/build/vendor/llama.cpp/ggml/src/Release/ggml.lib and object C:/Users/poko-/AppData/Local/Temp/tmpj2ijvozw/build/vendor/llama.cpp/ggml/src/Release/ggml.exp
ggml.vcxproj -> C:\Users\poko-\AppData\Local\Temp\tmpj2ijvozw\build\bin\Release\ggml.dll
FinalizeBuildStatus:
Deleting file "ggml.dir\Release\ggml.tlog\unsuccessfulbuild".
Touching "ggml.dir\Release\ggml.tlog\ggml.lastbuildstate".
Done Building Project "C:\Users\poko-\AppData\Local\Temp\tmpj2ijvozw\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (default targets).
Project "C:\Users\poko-\AppData\Local\Temp\tmpj2ijvozw\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (4) is building "C:\Users\poko-\AppData\Local\Temp\tmpj2ijvozw\build\vendor\llama.cpp\src\llama.vcxproj" (6) on node 1 (default targets).
C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\poko-\AppData\Local\Temp\tmpj2ijvozw\build\vendor\llama.cpp\src\llama.vcxproj]
Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
Creating directory "llama.dir\Release\llama.tlog".
InitializeBuildStatus:
Creating "llama.dir\Release\llama.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
Touching "llama.dir\Release\llama.tlog\unsuccessfulbuild".
CustomBuild:
Building Custom Rule C:/Users/poko-/AppData/Local/Temp/pip-install-i0qazzsi/llama-cpp-python_41fdea811ec3461c9539d88b683664a6/vendor/llama.cpp/src/CMakeLists.txt
ClCompile:
C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\bin\HostX64\x64\CL.exe /c /I"C:\Users\poko-\AppData\Local\Temp\pip-install-i0qazzsi\llama-cpp-python_41fdea811ec3461c9539d88b683664a6\vendor\llama.cpp\src." /I"C:\Users\poko-\AppData\Local\Temp\pip-install-i0qazzsi\llama-cpp-python_41fdea811ec3461c9539d88b683664a6\vendor\llama.cpp\src..\include" /I"C:\Users\poko-\AppData\Local\Temp\pip-install-i0qazzsi\llama-cpp-python_41fdea811ec3461c9539d88b683664a6\vendor\llama.cpp\ggml\src..\include" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D LLAMA_SHARED /D LLAMA_BUILD /D _CRT_SECURE_NO_WARNINGS /D "CMAKE_INTDIR="Release"" /D llama_EXPORTS /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /Fo"llama.dir\Release\" /Fd"llama.dir\Release\vc143.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\poko-\AppData\Local\Temp\pip-install-i0qazzsi\llama-cpp-python_41fdea811ec3461c9539d88b683664a6\vendor\llama.cpp\src\llama.cpp" "C:\Users\poko-\AppData\Local\Temp\pip-install-i0qazzsi\llama-cpp-python_41fdea811ec3461c9539d88b683664a6\vendor\llama.cpp\src\llama-vocab.cpp" "C:\Users\poko-\AppData\Local\Temp\pip-install-i0qazzsi\llama-cpp-python_41fdea811ec3461c9539d88b683664a6\vendor\llama.cpp\src\llama-grammar.cpp" "C:\Users\poko-\AppData\Local\Temp\pip-install-i0qazzsi\llama-cpp-python_41fdea811ec3461c9539d88b683664a6\vendor\llama.cpp\src\llama-sampling.cpp" "C:\Users\poko-\AppData\Local\Temp\pip-install-i0qazzsi\llama-cpp-python_41fdea811ec3461c9539d88b683664a6\vendor\llama.cpp\src\unicode.cpp" "C:\Users\poko-\AppData\Local\Temp\pip-install-i0qazzsi\llama-cpp-python_41fdea811ec3461c9539d88b683664a6\vendor\llama.cpp\src\unicode-data.cpp"
llama.cpp
llama-vocab.cpp
llama-grammar.cpp
llama-sampling.cpp
unicode.cpp
unicode-data.cpp
Generating Code...
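If this is the same MSVC optimization stall discussed above, the earlier /Od workaround might translate to this CUDA build as follows (untested; note also that recent llama.cpp versions use GGML_CUDA, with LLAMA_CUBLAS being the older, deprecated spelling):

set CMAKE_ARGS=-DGGML_CUDA=on -DCMAKE_CXX_FLAGS="/Od"
pip install llama-cpp-python --verbose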
