Commit
set nvidia docker true to run imagenet inference on GPU
ChaiBapchya committed Oct 24, 2019
1 parent ef94c5d commit 98f8eef
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion tests/nightly/JenkinsfileForBinaries
@@ -36,7 +36,7 @@ core_logic: {
     node(NODE_LINUX_CPU) {
       ws('workspace/build-mkldnn-gpu') {
         utils.init_git()
-        utils.docker_run('ubuntu_build_cuda', 'build_ubuntu_gpu_mkldnn', false)
+        utils.docker_run('ubuntu_build_cuda', 'build_ubuntu_gpu_mkldnn', true)

@marcoabreu

marcoabreu Oct 24, 2019

Contributor

A CPU instance does not have GPUs. Also, you don't need GPUs for compilation.

ChaiBapchya (Author, Contributor) commented on Oct 24, 2019:

Good catch! Reverting!

        utils.pack_lib('gpu', mx_lib_cpp_example_mkl)
      }
    }
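
For context on marcoabreu's comment: the hunk above belongs to a build stage that runs on a CPU node, so the third argument to utils.docker_run (whether to launch the container under nvidia-docker) should stay false there; enabling GPU access only makes sense in a separate stage that actually runs the ImageNet inference on a GPU node. A minimal sketch of that split, where the NODE_LINUX_GPU label, the 'ubuntu_gpu' platform, the 'test_imagenet_inference_gpu' run function, and the utils.unpack_and_init helper are illustrative assumptions rather than names taken from this repository:

    // Build on a CPU instance: compilation does not need a GPU, so the
    // nvidia-docker flag (third argument) stays false.
    node(NODE_LINUX_CPU) {
      ws('workspace/build-mkldnn-gpu') {
        utils.init_git()
        utils.docker_run('ubuntu_build_cuda', 'build_ubuntu_gpu_mkldnn', false)
        utils.pack_lib('gpu', mx_lib_cpp_example_mkl)
      }
    }

    // Run the ImageNet inference test on a GPU instance: the flag is true here
    // so the container is started under nvidia-docker and can see the GPUs.
    // The node label, platform, and function name below are assumed for illustration.
    node(NODE_LINUX_GPU) {
      ws('workspace/test-imagenet-inference-gpu') {
        utils.unpack_and_init('gpu', mx_lib_cpp_example_mkl)
        utils.docker_run('ubuntu_gpu', 'test_imagenet_inference_gpu', true)
      }
    }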
