This repository has been archived by the owner on Mar 21, 2024. It is now read-only.

MSVC build fails due to cub symlink: fatal error C1083: Cannot open include file: '../config.cuh': No such file or directory #1328

Closed
bjude opened this issue Oct 24, 2020 · 13 comments · Fixed by #1701
Assignees
Labels
P0: must have (Absolutely necessary. Critical issue, major blocker, etc.)
release: breaking change (Include in "Breaking Changes" section of release notes.)
type: bug: functional (Does not work as intended.)
Milestone

Comments

@bjude
Contributor

bjude commented Oct 24, 2020

With a freshly cloned Thrust (and recursively cloned CUB), I'm getting 'file not found' errors in CUB:

  Compiling CUDA source file headers\thrust\adjacent_difference.h.cu...

  F:\code\thrust\build>"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.0\bin\nvcc.exe" -gencode=arch=compute_80
  ,code=\"compute_80,compute_80\" -gencode=arch=compute_35,code=\"sm_35,compute_35\" -gencode=arch=compute_37,code=\"sm
  _37,compute_37\" -gencode=arch=compute_50,code=\"sm_50,compute_50\" -gencode=arch=compute_52,code=\"sm_52,compute_52\
  " -gencode=arch=compute_53,code=\"sm_53,compute_53\" -gencode=arch=compute_60,code=\"sm_60,compute_60\" -gencode=arch
  =compute_61,code=\"sm_61,compute_61\" -gencode=arch=compute_62,code=\"sm_62,compute_62\" -gencode=arch=compute_70,cod
  e=\"sm_70,compute_70\" -gencode=arch=compute_72,code=\"sm_72,compute_72\" -gencode=arch=compute_75,code=\"sm_75,compu
  te_75\" -gencode=arch=compute_80,code=\"sm_80,compute_80\" --use-local-env -ccbin "F:\Program Files (x86)\Microsoft V
  isual Studio\2019\Preview\VC\Tools\MSVC\14.28.29331\bin\HostX64\x64" -x cu   -IF:\code\thrust -IF:\code\thrust\depend
  encies\cub -I"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.0\include"     --keep-dir x64\Debug -maxrregcoun
  t=0  --machine 64 --compile -cudart static -Xcudafe=--display_error_number -Wno-deprecated-gpu-targets -Xcudafe=--pro
  mote_warnings -Xcompiler="/EHsc -Zi -Ob0 /WX /wd4244 /wd4267 /wd4800 /wd4146 /wd4494 /bigobj" -g   -D_WINDOWS -DTHRUS
  T_HOST_SYSTEM=THRUST_HOST_SYSTEM_CPP -DTHRUST_DEVICE_SYSTEM=THRUST_DEVICE_SYSTEM_CUDA -DNOMINMAX -D"CMAKE_INTDIR=\"De
  bug\"" -D"CMAKE_INTDIR=\"Debug\"" -D_MBCS -Xcompiler "/EHsc /W3 /nologo /Od /Fdthrust.headers.dir\Debug\thrust.header
  s.pdb /FS /Zi /RTC1 /MDd /GR" -o thrust.headers.dir\Debug\adjacent_difference.h.obj "F:\code\thrust\build\headers\thr
  ust\adjacent_difference.h.cu"
F:\code\thrust\cub\block\block_exchange.cuh(36): fatal error C1083: Cannot open include file: '../config.cuh': No such
file or directory [F:\code\thrust\build\thrust.headers.vcxproj]
  adjacent_difference.h.cu
F:\Program Files (x86)\Microsoft Visual Studio\2019\Preview\MSBuild\Microsoft\VC\v160\BuildCustomizations\CUDA 11.0.tar
gets(772,9): error MSB3721: The command ""C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.0\bin\nvcc.exe" -genco
de=arch=compute_80,code=\"compute_80,compute_80\" -gencode=arch=compute_35,code=\"sm_35,compute_35\" -gencode=arch=comp
ute_37,code=\"sm_37,compute_37\" -gencode=arch=compute_50,code=\"sm_50,compute_50\" -gencode=arch=compute_52,code=\"sm_
52,compute_52\" -gencode=arch=compute_53,code=\"sm_53,compute_53\" -gencode=arch=compute_60,code=\"sm_60,compute_60\" -
gencode=arch=compute_61,code=\"sm_61,compute_61\" -gencode=arch=compute_62,code=\"sm_62,compute_62\" -gencode=arch=comp
ute_70,code=\"sm_70,compute_70\" -gencode=arch=compute_72,code=\"sm_72,compute_72\" -gencode=arch=compute_75,code=\"sm_
75,compute_75\" -gencode=arch=compute_80,code=\"sm_80,compute_80\" --use-local-env -ccbin "F:\Program Files (x86)\Micro
soft Visual Studio\2019\Preview\VC\Tools\MSVC\14.28.29331\bin\HostX64\x64" -x cu   -IF:\code\thrust -IF:\code\thrust\de
pendencies\cub -I"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.0\include"     --keep-dir x64\Debug -maxrregco
unt=0  --machine 64 --compile -cudart static -Xcudafe=--display_error_number -Wno-deprecated-gpu-targets -Xcudafe=--pro
mote_warnings -Xcompiler="/EHsc -Zi -Ob0 /WX /wd4244 /wd4267 /wd4800 /wd4146 /wd4494 /bigobj" -g   -D_WINDOWS -DTHRUST_
HOST_SYSTEM=THRUST_HOST_SYSTEM_CPP -DTHRUST_DEVICE_SYSTEM=THRUST_DEVICE_SYSTEM_CUDA -DNOMINMAX -D"CMAKE_INTDIR=\"Debug\
"" -D"CMAKE_INTDIR=\"Debug\"" -D_MBCS -Xcompiler "/EHsc /W3 /nologo /Od /Fdthrust.headers.dir\Debug\thrust.headers.pdb
/FS /Zi /RTC1 /MDd /GR" -o thrust.headers.dir\Debug\adjacent_difference.h.obj "F:\code\thrust\build\headers\thrust\adja
cent_difference.h.cu"" exited with code 2. [F:\code\thrust\build\thrust.headers.vcxproj]

Here is the configure output:

-- Building for: Visual Studio 16 2019
-- Selecting Windows SDK version 10.0.18362.0 to target Windows 10.0.19041.
-- The CXX compiler identification is MSVC 19.28.29331.0
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: F:/Program Files (x86)/Microsoft Visual Studio/2019/Preview/VC/Tools/MSVC/14.28.29331/bin/Hostx64/x64/cl.exe - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found Thrust: F:/code/thrust/thrust/cmake/thrust-config.cmake (found version "1.11.0.0")
-- Found CUB: F:/code/thrust/dependencies/cub/cub/cmake/cub-config.cmake (found version "1.11.0.0")
-- Thrust: TargetInfo: thrust: (1.11.0.0)
-- Thrust: TargetInfo: thrust > IMPORTED: TRUE
-- Thrust: TargetInfo: thrust > INTERFACE_LINK_LIBRARIES: Thrust::CPP::Host;Thrust::CUDA::Device
-- Performing Test CXX_FLAG__W3
-- Performing Test CXX_FLAG__W3 - Success
-- Performing Test CXX_FLAG__WX
-- Performing Test CXX_FLAG__WX - Success
-- Performing Test CXX_FLAG__wd4244
-- Performing Test CXX_FLAG__wd4244 - Success
-- Performing Test CXX_FLAG__wd4267
-- Performing Test CXX_FLAG__wd4267 - Success
-- Performing Test CXX_FLAG__wd4800
-- Performing Test CXX_FLAG__wd4800 - Success
-- Performing Test CXX_FLAG__wd4146
-- Performing Test CXX_FLAG__wd4146 - Success
-- Performing Test CXX_FLAG__wd4494
-- Performing Test CXX_FLAG__wd4494 - Success
-- Performing Test CXX_FLAG__bigobj
-- Performing Test CXX_FLAG__bigobj - Success
-- Enabling Thrust configuration: cpp.cuda.cpp14
-- CPP system found?  TRUE
-- CUDA system found? TRUE
-- TBB system found?  FALSE
-- OMP system found?  FALSE
-- The CUDA compiler identification is NVIDIA 11.0.221
-- Detecting CUDA compiler ABI info
-- Detecting CUDA compiler ABI info - done
-- Check for working CUDA compiler: C:/Program Files/NVIDIA GPU Computing Toolkit/CUDA/v11.0/bin/nvcc.exe - skipped
-- Detecting CUDA compile features
-- Detecting CUDA compile features - done
-- Thrust: Enabled CUDA architectures: sm_35 sm_37 sm_50 sm_52 sm_53 sm_60 sm_61 sm_62 sm_70 sm_72 sm_75 sm_80 compute_80
sm_53 does not support RDC. Targets that require RDC will be built without support for this architecture.
sm_62 does not support RDC. Targets that require RDC will be built without support for this architecture.
sm_72 does not support RDC. Targets that require RDC will be built without support for this architecture.
-- Configuring done
-- Generating done
-- Build files have been written to: F:/code/thrust/build
@bjude
Contributor Author

bjude commented Oct 24, 2020

It can be fixed by adding some subdirectories to the CUB include paths in cub/cub/cmake/cub-config.cmake:

target_include_directories(_CUB_CUB INTERFACE "${_CUB_INCLUDE_DIR}/cub/block")
target_include_directories(_CUB_CUB INTERFACE "${_CUB_INCLUDE_DIR}/cub/block/specializations")

but that doesn't really feel like the right way to solve the issue.

Interestingly, the issue doesn't manifest if I try to build just CUB in isolation.
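
For comparison, the isolated CUB build consumes CUB through its own CMake package, so the include directory comes in as a usage requirement of the imported target rather than from hand-added paths. A rough sketch of that pattern (assuming the CUB::CUB target exported by cub-config.cmake, with cub_consumer and my_prog as made-up names):

```cmake
# Hypothetical consumer project. CUB::CUB is the imported interface target
# that cub-config.cmake is expected to define; linking against it propagates
# CUB's include directory, so no manual -I flags are needed.
cmake_minimum_required(VERSION 3.15)
project(cub_consumer CXX CUDA)

find_package(CUB REQUIRED CONFIG
  PATHS "F:/code/thrust/dependencies/cub/cub/cmake")  # dir containing cub-config.cmake, per the configure log above

add_executable(my_prog main.cu)
target_link_libraries(my_prog PRIVATE CUB::CUB)
```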

@alliepiper alliepiper added the triage Needs investigation and classification. label Oct 24, 2020
@alliepiper alliepiper added this to the 1.11.0 milestone Oct 24, 2020
@alliepiper
Collaborator

MSVC 19.28.29331.0

Where did this version of MSVC come from?

I build regularly with MSVC Community 2019, and the updater doesn't report any version newer than 19.27.29112. Everything builds just fine there :-/

Is this a pre-release developer snapshot of MSVC or something?

@alliepiper alliepiper added the info needed Cannot make progress without more information. label Oct 26, 2020
@bjude
Contributor Author

bjude commented Oct 27, 2020

This was 16.8 preview 4, I believe; I'm not at that machine currently, though. The preview builds don't show up in the standard installer; I think there's a separate installer for them.

I'll install 16.7 and see if I can reproduce it.

@bjude
Contributor Author

bjude commented Oct 28, 2020

Same error with 19.27.29112.0, CMake 3.18.0.

Have you added any CMake flags when you build with MSVC? I'm just doing

cmake ..
cmake --build .

from a build dir in the root of the Thrust dir.

@alliepiper
Collaborator

I just built the failing compilation units with a fresh, no-option* cmake .. on latest MSVC 2019.

*I did disable a few SM archs (all but sm_75), but that shouldn't affect this.

It's strange. The compiler is finding the CUB headers, since the error is occurring in F:\code\thrust\cub\block\block_exchange.cuh, which fails at #include "../config.cuh". The only include path CUB needs is already added (-IF:\code\thrust\dependencies\cub), so I'm not sure why MSVC is failing to locate relative paths on your machine.

Some sanity checks:

  1. Does F:\code\thrust\cub\config.cuh exist on your system?
  2. Are you doing anything weird with symlinks in the source dir?
  3. If you make a small test project that uses relative includes, does it work on your system?

It probably wouldn't be a bad idea for us to sweep through the CUB codebase and convert all of the relative paths to full includes relative to the CUB include path, but it should work as-is.

@alliepiper
Collaborator

I think I see a possible explanation. Are you using the MSVC developer prompt to configure this, or some other shell?

Here's what's weird:

Both the Thrust and CUB include paths are added to the nvcc invocation:

-IF:\code\thrust -IF:\code\thrust\dependencies\cub

However, the file that MSVC is complaining about is

F:\code\thrust\cub\block\block_exchange.cuh, not
F:\code\thrust\dependencies\cub\cub\block\block_exchange.cuh

This is due to a UNIX-style cub symlink that currently exists in the root of the Thrust repo. It sounds like some part of your toolchain or environment is picking up the F:\code\thrust\cub symlink to pull in the cub\block\block_exchange.cuh header, but then the preprocessor gets confused: it sees the F:\code\thrust\cub unix symlink as a file, and the path F:\code\thrust\cub\block\..\config.cuh simply doesn't make sense to native Windows tools.

I'm trying to get rid of that symlink because it's a timebomb in a cross-platform project, and it looks like you found a way to make it explode. Issue #1283 is tracking the effort to remove it in a way that won't break anyone's existing build.

As a work-around, try removing the F:\code\thrust\cub file/symlink and see if your build finishes.

cc: @brycelelbach for visibility -- symlinks are bad.

@alliepiper alliepiper changed the title Build failure due to relative includes in CUB MSVC build fails: fatal error C1083: Cannot open include file: '../config.cuh': No such file or directory Oct 28, 2020
@alliepiper
Collaborator

Updated title to improve google-ability of the underlying issue. Others may hit this before we get it fixed.

@bjude
Contributor Author

bjude commented Oct 29, 2020

Yep, that's it! Deleting the symlink fixes the build.

What problem did the symlink solve? I imagine the CUB include dirs could be handled by the CUB CMake target include directories.

For reference, I tried building from the MSVC developer prompt and from VS Code (which I'm pretty sure uses vcvars.bat from the dev prompt under the hood).

@alliepiper
Collaborator

What problem did the symlink solve? I imagine the CUB include dirs could be handled by the CUB CMake target include directories.

Some people wanted the simplicity of just doing -I/path/to/thrust/checkout/, but they only tested on Linux. Now that several years have passed, we worry that removing the symlink will break downstream builds. We're considering a monorepo for thrust/cub anyway, so we're going to punt on fixing the symlink until we decide on a path to the monorepo, which will eliminate the symlink problem.
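
For reference, once the symlink goes away, a raw-include-path setup just needs the dependency directories added explicitly, mirroring the -I flags in the build log above. A rough CMake sketch (THRUST_ROOT and my_app are placeholder names, not anything the build defines):

```cmake
# Sketch only: add the Thrust checkout and its bundled CUB checkout to the
# include path explicitly instead of relying on the root-level cub symlink.
set(THRUST_ROOT "F:/code/thrust")  # placeholder path to the Thrust checkout

add_executable(my_app main.cu)
target_include_directories(my_app PRIVATE
  "${THRUST_ROOT}"
  "${THRUST_ROOT}/dependencies/cub")
```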

For reference, I tried building from the MSVC developer prompt and from VS Code (which I'm pretty sure uses vcvars.bat from the dev prompt under the hood).

Ah, I thought maybe you were using some WSL or Git Bash setup that would understand the unix symlink and generate the bad path, which MSVC's preprocessor would then choke on.

Is this breaking / blocking your projects, or is deleting the symlink a reasonable workaround until we get the monorepo?

@alliepiper alliepiper removed the info needed Cannot make progress without more information. label Oct 30, 2020
@alliepiper alliepiper changed the title MSVC build fails: fatal error C1083: Cannot open include file: '../config.cuh': No such file or directory MSVC build fails due to cub symlink: fatal error C1083: Cannot open include file: '../config.cuh': No such file or directory Nov 2, 2020
@alliepiper alliepiper added type: bug: functional Does not work as intended. and removed triage Needs investigation and classification. labels Nov 2, 2020
@alliepiper alliepiper modified the milestones: 1.11.0, Backlog Nov 2, 2020
@bjude
Contributor Author

bjude commented Nov 3, 2020

It's not really blocking any projects; I only tend to use the GitHub branch of Thrust to contribute to Thrust, not for real work. Deleting the symlink and disabling warnings-as-errors is a fine workaround for now.

FWIW, a monorepo shouldn't be necessary if the relative includes are removed and the CMake projects are set up properly. Obviously that could break people who do -I/path/to/thrust, but we don't want to be supporting such bad behaviour, do we? :P

@alliepiper
Collaborator

The monorepo is to address many issues, not just the symlink. Thrust and CUB are version locked, tightly coupled, and share a lot of build infrastructure. They're effectively two layers of abstraction providing the same functionality, so it makes sense to combine the code bases.

FWIW, a monorepo shouldn't be necessary if the relative includes are removed and the CMake projects are set up properly.

I'm not sure what you mean here; the relative includes internal to each project should be fine, and we have a new CMake config that seems to be working fairly well for most users. Have you had other issues related to these?
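
For reference, the intended usage of that config looks roughly like the sketch below; ThrustCUDA and my_app are placeholder names, and this assumes the thrust_create_target helper provided by thrust-config.cmake (the package also pulls in CUB's include path, so no symlink is involved):

```cmake
# Sketch: let Thrust's CMake package configure the Thrust and CUB include
# paths instead of adding -I flags by hand.
find_package(Thrust REQUIRED CONFIG
  PATHS "F:/code/thrust/thrust/cmake")  # dir containing thrust-config.cmake, per the configure log

# Create an interface target using the C++ host system and the CUDA device system.
thrust_create_target(ThrustCUDA HOST CPP DEVICE CUDA)

add_executable(my_app main.cu)
target_link_libraries(my_app PRIVATE ThrustCUDA)
```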

@KernelA

KernelA commented Jul 23, 2021

I have the same issue when using Thrust with CMake on a GitHub Actions windows-2019 virtual machine.

Logs:

2021-07-23T12:20:44.0344123Z ##[group]Run $trimeshDir = "..\trimesh2-build"
2021-07-23T12:20:44.0344888Z $trimeshDir = "..\trimesh2-build"
2021-07-23T12:20:44.0345317Z cmake -A x64 `
2021-07-23T12:20:44.0345924Z -DCMAKE_TOOLCHAIN_FILE:FILEPATH="C:\vcpkg\scripts\buildsystems\vcpkg.cmake" `
2021-07-23T12:20:44.0346693Z -DTrimesh2_INCLUDE_DIR:PATH="$trimeshDir\include" `
2021-07-23T12:20:44.0347349Z -DTrimesh2_LINK_DIR:PATH="$trimeshDir\lib.Win64.vs142" `
2021-07-23T12:20:44.0347944Z -DCUDA_ARCH:STRING=$env:CUDA_ARCH `
2021-07-23T12:20:44.0348498Z -DCMAKE_BUILD_TYPE=Release `
2021-07-23T12:20:44.0349098Z -DThrust_DIR=D:\a\cuda_voxelizer\cuda_voxelizer\..\thrust-repo\thrust\cmake `
2021-07-23T12:20:44.0349646Z -S . -B .\build
2021-07-23T12:20:44.0392387Z shell: C:\Program Files\PowerShell\7\pwsh.EXE -command ". '{0}'"
2021-07-23T12:20:44.0392888Z env:
2021-07-23T12:20:44.0393266Z   CUDA_MAJOR_VERSION: 11.3
2021-07-23T12:20:44.0393740Z   CUDA_PATCH_VERSION: 1
2021-07-23T12:20:44.0394157Z   TRIMESH_VERSION: 2020.03.04
2021-07-23T12:20:44.0394513Z   CUDAARCHS: 60
2021-07-23T12:20:44.0394922Z   NVIDIA_TRUST_VERSION: cuda-11.3
2021-07-23T12:20:44.0395478Z   CUDA_PATH: C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.3
2021-07-23T12:20:44.0396184Z   CUDA_PATH_V11_3: C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.3
2021-07-23T12:20:44.0396738Z   CUDA_PATH_VX_Y: CUDA_PATH_V11_3
2021-07-23T12:20:44.0397107Z ##[endgroup]
2021-07-23T12:20:44.4055037Z -- Building for: Visual Studio 16 2019
2021-07-23T12:20:56.5812233Z -- The CXX compiler identification is MSVC 19.29.30038.1
2021-07-23T12:21:02.5776535Z -- The CUDA compiler identification is NVIDIA 11.3.109
2021-07-23T12:21:02.6732984Z -- Detecting CXX compiler ABI info
2021-07-23T12:21:05.0103549Z -- Detecting CXX compiler ABI info - done
2021-07-23T12:21:05.0127822Z -- Check for working CXX compiler: C:/Program Files (x86)/Microsoft Visual Studio/2019/Enterprise/VC/Tools/MSVC/14.29.30037/bin/Hostx64/x64/cl.exe - skipped
2021-07-23T12:21:05.0133657Z -- Detecting CXX compile features
2021-07-23T12:21:05.0160687Z -- Detecting CXX compile features - done
2021-07-23T12:21:05.0333761Z -- Detecting CUDA compiler ABI info
2021-07-23T12:21:08.0203734Z -- Detecting CUDA compiler ABI info - done
2021-07-23T12:21:08.0539020Z -- Check for working CUDA compiler: C:/Program Files/NVIDIA GPU Computing Toolkit/CUDA/v11.3/bin/nvcc.exe - skipped
2021-07-23T12:21:08.0543309Z -- Detecting CUDA compile features
2021-07-23T12:21:08.0549805Z -- Detecting CUDA compile features - done
2021-07-23T12:21:11.0095399Z -- Found OpenMP_CXX: -openmp (found version "2.0") 
2021-07-23T12:21:11.0100286Z -- Found OpenMP: TRUE (found version "2.0")  
2021-07-23T12:21:11.0831989Z -- Found CUDAToolkit: C:/Program Files/NVIDIA GPU Computing Toolkit/CUDA/v11.3/include (found version "11.3.109") 
2021-07-23T12:21:15.4678404Z -- Found Trimesh2 include: D:/a/cuda_voxelizer/trimesh2-build/include/TriMesh.h
2021-07-23T12:21:15.4679974Z -- Found Trimesh2 lib: D:/a/cuda_voxelizer/trimesh2-build/lib.Win64.vs142/trimesh.lib
2021-07-23T12:21:15.4829470Z -- Found Thrust: D:/a/cuda_voxelizer/thrust-repo/thrust/cmake/thrust-config.cmake (found version "1.11.0.0") 
2021-07-23T12:21:15.4888228Z -- Found CUB: D:/a/cuda_voxelizer/thrust-repo/dependencies/cub/cub/cmake/cub-config.cmake (found version "1.11.0.0") 
2021-07-23T12:21:15.4911045Z -- Configuring done
2021-07-23T12:21:15.5498164Z -- Generating done
2021-07-23T12:21:15.5505602Z CMake Warning:
2021-07-23T12:21:15.5506426Z   Manually-specified variables were not used by the project:
2021-07-23T12:21:15.5507059Z 
2021-07-23T12:21:15.5507554Z     CUDA_ARCH
2021-07-23T12:21:15.5507929Z 
2021-07-23T12:21:15.5508271Z 
2021-07-23T12:21:15.5516155Z -- Build files have been written to: D:/a/cuda_voxelizer/cuda_voxelizer/build



2021-07-23T12:21:15.7235016Z ##[group]Run cmake --build .\build --parallel 2 --target ALL_BUILD --config Release
2021-07-23T12:21:15.7235916Z cmake --build .\build --parallel 2 --target ALL_BUILD --config Release
2021-07-23T12:21:15.7276617Z shell: C:\Program Files\PowerShell\7\pwsh.EXE -command ". '{0}'"
2021-07-23T12:21:15.7277144Z env:
2021-07-23T12:21:15.7277514Z   CUDA_MAJOR_VERSION: 11.3
2021-07-23T12:21:15.7277954Z   CUDA_PATCH_VERSION: 1
2021-07-23T12:21:15.7278332Z   TRIMESH_VERSION: 2020.03.04
2021-07-23T12:21:15.7278728Z   CUDAARCHS: 60
2021-07-23T12:21:15.7279157Z   NVIDIA_TRUST_VERSION: cuda-11.3
2021-07-23T12:21:15.7279712Z   CUDA_PATH: C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.3
2021-07-23T12:21:15.7280408Z   CUDA_PATH_V11_3: C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.3
2021-07-23T12:21:15.7280974Z   CUDA_PATH_VX_Y: CUDA_PATH_V11_3
2021-07-23T12:21:15.7281362Z ##[endgroup]
2021-07-23T12:21:16.1803944Z Microsoft (R) Build Engine version 16.10.2+857e5a733 for .NET Framework
2021-07-23T12:21:16.1805376Z Copyright (C) Microsoft Corporation. All rights reserved.
2021-07-23T12:21:16.1805954Z 
2021-07-23T12:21:16.6176867Z   Checking Build System
2021-07-23T12:21:16.7999109Z   Building Custom Rule D:/a/cuda_voxelizer/cuda_voxelizer/CMakeLists.txt
2021-07-23T12:21:17.6273841Z   Compiling CUDA source file ..\src\voxelize.cu...
2021-07-23T12:21:18.7586958Z   
2021-07-23T12:21:18.7608519Z   D:\a\cuda_voxelizer\cuda_voxelizer\build>"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.3\bin\nvcc.exe" -gencode=arch=compute_60,code=\"compute_60,compute_60\" -gencode=arch=compute_60,code=\"sm_60,compute_60\" --use-local-env -ccbin "C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\VC\Tools\MSVC\14.29.30037\bin\HostX64\x64" -x cu   -I"D:\a\cuda_voxelizer\cuda_voxelizer\..\trimesh2-build\include" -I"D:\a\cuda_voxelizer\thrust-repo" -I"D:\a\cuda_voxelizer\thrust-repo\dependencies\cub" -I"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.3\include" -I"C:\vcpkg\installed\x64-windows\include" -I"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.3\include"     --keep-dir x64\Release  -maxrregcount=0  --machine 64 --compile -cudart static -std=c++17 -Xcompiler="/EHsc -Ob2"   -D_WINDOWS -DNDEBUG -DTHRUST_HOST_SYSTEM=THRUST_HOST_SYSTEM_CPP -DTHRUST_DEVICE_SYSTEM=THRUST_DEVICE_SYSTEM_CUDA -D"CMAKE_INTDIR=\"Release\"" -D_MBCS -DWIN32 -D_WINDOWS -DNDEBUG -DTHRUST_HOST_SYSTEM=THRUST_HOST_SYSTEM_CPP -DTHRUST_DEVICE_SYSTEM=THRUST_DEVICE_SYSTEM_CUDA -D"CMAKE_INTDIR=\"Release\"" -Xcompiler "/EHsc /W1 /nologo /O2 /Fdcuda_voxelizer.dir\Release\vc142.pdb /FS   /MD " -o cuda_voxelizer.dir\Release\voxelize.obj "D:\a\cuda_voxelizer\cuda_voxelizer\src\voxelize.cu" 
2021-07-23T12:21:18.7613542Z   Compiling CUDA source file ..\src\thrust_operations.cu...
2021-07-23T12:21:18.8483861Z   
2021-07-23T12:21:18.8501921Z   D:\a\cuda_voxelizer\cuda_voxelizer\build>"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.3\bin\nvcc.exe" -gencode=arch=compute_60,code=\"compute_60,compute_60\" -gencode=arch=compute_60,code=\"sm_60,compute_60\" --use-local-env -ccbin "C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\VC\Tools\MSVC\14.29.30037\bin\HostX64\x64" -x cu   -I"D:\a\cuda_voxelizer\cuda_voxelizer\..\trimesh2-build\include" -I"D:\a\cuda_voxelizer\thrust-repo" -I"D:\a\cuda_voxelizer\thrust-repo\dependencies\cub" -I"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.3\include" -I"C:\vcpkg\installed\x64-windows\include" -I"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.3\include"     --keep-dir x64\Release  -maxrregcount=0  --machine 64 --compile -cudart static -std=c++17 -Xcompiler="/EHsc -Ob2"   -D_WINDOWS -DNDEBUG -DTHRUST_HOST_SYSTEM=THRUST_HOST_SYSTEM_CPP -DTHRUST_DEVICE_SYSTEM=THRUST_DEVICE_SYSTEM_CUDA -D"CMAKE_INTDIR=\"Release\"" -D_MBCS -DWIN32 -D_WINDOWS -DNDEBUG -DTHRUST_HOST_SYSTEM=THRUST_HOST_SYSTEM_CPP -DTHRUST_DEVICE_SYSTEM=THRUST_DEVICE_SYSTEM_CUDA -D"CMAKE_INTDIR=\"Release\"" -Xcompiler "/EHsc /W1 /nologo /O2 /Fdcuda_voxelizer.dir\Release\vc142.pdb /FS   /MD " -o cuda_voxelizer.dir\Release\thrust_operations.obj "D:\a\cuda_voxelizer\cuda_voxelizer\src\thrust_operations.cu" 
2021-07-23T12:21:18.8531531Z D:\a\cuda_voxelizer\thrust-repo\cub\block\block_exchange.cuh(36): fatal error C1083: Cannot open include file: '../config.cuh': No such file or directory [D:\a\cuda_voxelizer\cuda_voxelizer\build\cuda_voxelizer.vcxproj]
2021-07-23T12:21:18.8564925Z   thrust_operations.cu

@alliepiper alliepiper added P0: must have Absolutely necessary. Critical issue, major blocker, etc. release: breaking change Include in "Breaking Changes" section of release notes. labels Apr 25, 2022
@alliepiper alliepiper self-assigned this Apr 25, 2022
alliepiper added a commit to alliepiper/thrust that referenced this issue May 20, 2022
This breaks builds on some toolchains.

Users should explicitly set their include directories for Thrust's
dependencies:

```
-I ${THRUST_ROOT}/dependencies/cub
-I ${THRUST_ROOT}/dependencies/libcudacxx/include
```

If using Thrust's CMake packages, these paths will be configured
automatically.

Fixes NVIDIA#1328.
@alliepiper
Copy link
Collaborator

This will no longer be an issue after Thrust 2.0 🎉

#1701
