Merged
4 changes: 3 additions & 1 deletion DEVELOPER.md
Original file line number Diff line number Diff line change
@@ -2,7 +2,9 @@

Envoy is built using the Bazel build system. CircleCI builds, tests, and runs coverage against all pull requests and the master branch.

To get started building Envoy locally, see the [Bazel quick start](https://github.com/envoyproxy/envoy/blob/master/bazel/README.md#quick-start-bazel-build-for-developers). To run tests, there are Bazel [targets](https://github.com/envoyproxy/envoy/blob/master/bazel/README.md#testing-envoy-with-bazel) for Google Test. To generate a coverage report, use the tooling for [gcovr](https://github.com/envoyproxy/envoy/blob/master/bazel/README.md#coverage-builds).
To get started building Envoy locally, see the [Bazel quick start](https://github.com/envoyproxy/envoy/blob/master/bazel/README.md#quick-start-bazel-build-for-developers).
To run tests, there are Bazel [targets](https://github.com/envoyproxy/envoy/blob/master/bazel/README.md#testing-envoy-with-bazel) for Google Test.
To generate a coverage report, there is a [coverage build script](https://github.com/envoyproxy/envoy/blob/master/bazel/README.md#coverage-builds).

If you plan to contribute to Envoy, you may find it useful to install the Envoy [development support toolchain](https://github.com/envoyproxy/envoy/blob/master/support/README.md), which helps automate parts of the development process, particularly those involving code review.

6 changes: 2 additions & 4 deletions bazel/README.md
@@ -435,10 +435,8 @@ https://github.com/bazelbuild/bazel/issues/2805.

# Coverage builds

To generate coverage results, make sure you have
[`gcovr`](https://github.com/gcovr/gcovr) 3.3 in your `PATH` (or set `GCOVR` to
point at it) and are using a GCC toolchain (clang does not work currently, see
https://github.com/envoyproxy/envoy/issues/1000). Then run:
To generate coverage results, make sure you are using a clang toolchain and have `llvm-cov` and
`llvm-profdata` in your `PATH`. Then run:

```
test/run_envoy_bazel_coverage.sh
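Since the updated instructions assume `llvm-cov` and `llvm-profdata` are discoverable, a small preflight check can save a long failed run. This is a hypothetical sketch, not part of the repository; it only probes for the tools the new flow invokes:

```shell
# Hypothetical preflight check: confirm the LLVM coverage tools the
# coverage script invokes are on PATH before kicking off a long run.
for tool in llvm-cov llvm-profdata; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "found: $tool"
  else
    echo "missing: $tool (install LLVM or add its bin/ directory to PATH)"
  fi
done
```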
24 changes: 0 additions & 24 deletions bazel/external/gcovr.BUILD

This file was deleted.

14 changes: 0 additions & 14 deletions bazel/repositories.bzl
@@ -140,7 +140,6 @@ def envoy_dependencies(skip_targets = []):
_com_github_envoyproxy_sqlparser()
_com_github_fmtlib_fmt()
_com_github_gabime_spdlog()
_com_github_gcovr_gcovr()
_com_github_google_benchmark()
_com_github_google_jwt_verify()
_com_github_google_libprotobuf_mutator()
@@ -164,9 +163,6 @@ def envoy_dependencies(skip_targets = []):
_io_opentracing_cpp()
_net_zlib()

# Used for bundling gcovr into a relocatable .par file.
_repository_impl("subpar")

_python_deps()
_cc_deps()
_go_deps(skip_targets)
@@ -257,16 +253,6 @@ def _com_github_gabime_spdlog():
actual = "@com_github_gabime_spdlog//:spdlog",
)

def _com_github_gcovr_gcovr():
_repository_impl(
name = "com_github_gcovr_gcovr",
build_file = "@envoy//bazel/external:gcovr.BUILD",
)
native.bind(
name = "gcovr",
actual = "@com_github_gcovr_gcovr//:gcovr",
)

def _com_github_google_benchmark():
location = REPOSITORY_LOCATIONS["com_github_google_benchmark"]
http_archive(
12 changes: 0 additions & 12 deletions bazel/repository_locations.bzl
@@ -66,11 +66,6 @@ REPOSITORY_LOCATIONS = dict(
strip_prefix = "spdlog-1.3.1",
urls = ["https://github.com/gabime/spdlog/archive/v1.3.1.tar.gz"],
),
com_github_gcovr_gcovr = dict(
sha256 = "1c52a71f245adfe1b45e30fbe5015337fe66546f17f40038b3969b7b42acceed",
strip_prefix = "gcovr-3.4",
urls = ["https://github.com/gcovr/gcovr/archive/3.4.tar.gz"],
),
com_github_google_libprotobuf_mutator = dict(
sha256 = "97b3639630040f41c45f45838ab00b78909e6b4cb69c8028e01302bea5b79495",
strip_prefix = "libprotobuf-mutator-c3d2faf04a1070b0b852b0efdef81e1a81ba925e",
@@ -229,13 +224,6 @@ REPOSITORY_LOCATIONS = dict(
sha256 = "105f8d68616f8248e24bf0e9372ef04d3cc10104f1980f54d57b2ce73a5ad56a",
urls = ["https://pypi.python.org/packages/source/s/six/six-1.10.0.tar.gz#md5=34eed507548117b2ab523ab14b2f8b55"],
),
# I'd love to name this `com_github_google_subpar`, but something in the Subpar
# code assumes its repository name is just `subpar`.
subpar = dict(
sha256 = "b80297a1b8d38027a86836dbadc22f55dc3ecad56728175381aa6330705ac10f",
strip_prefix = "subpar-2.0.0",
urls = ["https://github.com/google/subpar/archive/2.0.0.tar.gz"],
),
io_opencensus_cpp = dict(
sha256 = "d6d68704c419a9e892bd1f942e09509ebc5a318499a1abcf2c09734e5dc56e19",
strip_prefix = "opencensus-cpp-1145dd77ffb7a2845c71c8e6ca188ef55e4ff607",
15 changes: 0 additions & 15 deletions ci/build_setup.sh
@@ -40,21 +40,6 @@
fi
export ENVOY_FILTER_EXAMPLE_SRCDIR="${BUILD_DIR}/envoy-filter-example"

# Make sure that /source doesn't contain /build on the underlying host
# filesystem, including via hard links or symlinks. We can get into weird
# loops with Bazel symlinking and gcovr's path traversal if this is true, so
# best to keep /source and /build in distinct directories on the host
# filesystem.
SENTINEL="${BUILD_DIR}"/bazel.sentinel
touch "${SENTINEL}"
if [[ -n "$(find -L "${ENVOY_SRCDIR}" -name "$(basename "${SENTINEL}")")" ]]
then
rm -f "${SENTINEL}"
echo "/source mount must not contain /build mount"
exit 1
fi
rm -f "${SENTINEL}"

# Environment setup.
export USER=bazel
export TEST_TMPDIR=${BUILD_DIR}/tmp
17 changes: 3 additions & 14 deletions ci/do_ci.sh
@@ -241,22 +241,11 @@ elif [[ "$CI_TARGET" == "bazel.api" ]]; then
@envoy_api//tools:tap2pcap_test
exit 0
elif [[ "$CI_TARGET" == "bazel.coverage" ]]; then
setup_gcc_toolchain
setup_clang_toolchain
echo "bazel coverage build with tests ${TEST_TARGETS}"

# gcovr is a pain to run with `bazel run`, so package it up into a
# relocatable and hermetic-ish .par file.
bazel build --python_version=PY2 @com_github_gcovr_gcovr//:gcovr.par
export GCOVR="/tmp/gcovr.par"
cp -f "${ENVOY_SRCDIR}/bazel-bin/external/com_github_gcovr_gcovr/gcovr.par" ${GCOVR}

# Reduce the amount of memory and number of cores Bazel tries to use to
# prevent it from launching too many subprocesses. This should prevent the
# system from running out of memory and killing tasks. See discussion on
# https://github.com/envoyproxy/envoy/pull/5611.
# TODO(akonradi): use --local_cpu_resources flag once Bazel has a release
# after 0.21.
[ -z "$CIRCLECI" ] || export BAZEL_BUILD_OPTIONS="${BAZEL_BUILD_OPTIONS} --local_resources=12288,4,1"
# LLVM coverage is a memory hog too.
[ -z "$CIRCLECI" ] || export BAZEL_BUILD_OPTIONS="${BAZEL_BUILD_OPTIONS} --local_cpu_resources=6"

test/run_envoy_bazel_coverage.sh ${TEST_TARGETS}
collect_build_profile coverage
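Under `BAZEL_USE_LLVM_NATIVE_COVERAGE=1`, Bazel drives clang's source-based coverage: instrument, run, merge raw profiles with `llvm-profdata`, then report with `llvm-cov`. A minimal standalone sketch of that pipeline, with made-up file names and guarded so it only runs where the toolchain exists:

```shell
# Standalone illustration of the clang source-based coverage pipeline
# that Bazel's native LLVM coverage orchestrates per test target.
# File names here are invented for the demo.
cat > /tmp/cov_demo.c <<'EOF'
int main(void) { return 0; }
EOF
if command -v clang >/dev/null 2>&1 && \
   command -v llvm-profdata >/dev/null 2>&1 && \
   command -v llvm-cov >/dev/null 2>&1; then
  # 1. Instrument the binary for profile generation + coverage mapping.
  clang -fprofile-instr-generate -fcoverage-mapping /tmp/cov_demo.c -o /tmp/cov_demo
  # 2. Run it; the raw profile lands where LLVM_PROFILE_FILE points.
  LLVM_PROFILE_FILE=/tmp/cov_demo.profraw /tmp/cov_demo
  # 3. Merge raw profiles into an indexed profile.
  llvm-profdata merge -sparse /tmp/cov_demo.profraw -o /tmp/cov_demo.profdata
  # 4. Report line coverage against the instrumented binary.
  llvm-cov report /tmp/cov_demo -instr-profile=/tmp/cov_demo.profdata
else
  echo "LLVM toolchain not found; skipping coverage demo"
fi
```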
16 changes: 0 additions & 16 deletions test/coverage/gcc_only_test/BUILD

This file was deleted.

12 changes: 0 additions & 12 deletions test/coverage/gcc_only_test/gcc_only_test.cc

This file was deleted.

7 changes: 0 additions & 7 deletions test/coverage/gen_build.sh
@@ -44,13 +44,6 @@ if [ -n "${EXTRA_QUERY_PATHS}" ]; then
TARGETS="$TARGETS $("${BAZEL_BIN}" query ${BAZEL_QUERY_OPTIONS} "attr('tags', 'coverage_test_lib', ${EXTRA_QUERY_PATHS})" | grep "^//")"
fi

# gcov requires gcc
if [ "${NO_GCOV}" != 1 ]
then
# Here we use the synthetic library target created by envoy_build_system.bzl
TARGETS="${TARGETS} ${REPOSITORY}//test/coverage/gcc_only_test:gcc_only_test_lib_internal_only"
fi

(
cat << EOF
# This file is generated by test/coverage/gen_build.sh automatically prior to
102 changes: 19 additions & 83 deletions test/run_envoy_bazel_coverage.sh
@@ -3,21 +3,11 @@
set -e

[[ -z "${SRCDIR}" ]] && SRCDIR="${PWD}"
[[ -z "${GCOVR_DIR}" ]] && GCOVR_DIR="${SRCDIR}/bazel-$(basename "${SRCDIR}")"
[[ -z "${TESTLOGS_DIR}" ]] && TESTLOGS_DIR="${SRCDIR}/bazel-testlogs"
[[ -z "${BAZEL_COVERAGE}" ]] && BAZEL_COVERAGE=bazel
[[ -z "${GCOVR}" ]] && GCOVR=gcovr
[[ -z "${WORKSPACE}" ]] && WORKSPACE=envoy
[[ -z "${VALIDATE_COVERAGE}" ]] && VALIDATE_COVERAGE=true

echo "Starting run_envoy_bazel_coverage.sh..."
echo " PWD=$(pwd)"
echo " SRCDIR=${SRCDIR}"
echo " GCOVR_DIR=${GCOVR_DIR}"
echo " TESTLOGS_DIR=${TESTLOGS_DIR}"
echo " BAZEL_COVERAGE=${BAZEL_COVERAGE}"
echo " GCOVR=${GCOVR}"
echo " WORKSPACE=${WORKSPACE}"
echo " VALIDATE_COVERAGE=${VALIDATE_COVERAGE}"

# This is the target that will be run to generate coverage data. It can be overridden by consumer
@@ -31,97 +21,43 @@ else
COVERAGE_TARGETS=//test/...
fi

# This is where we are going to copy the .gcno files into.
GCNO_ROOT=bazel-out/k8-dbg/bin/test/coverage/coverage_tests.runfiles/"${WORKSPACE}"
echo " GCNO_ROOT=${GCNO_ROOT}"

echo "Cleaning .gcno from previous coverage runs..."
NUM_PREVIOUS_GCNO_FILES=0
for f in $(find -L "${GCNO_ROOT}" -name "*.gcno")
do
rm -f "${f}"
let NUM_PREVIOUS_GCNO_FILES=NUM_PREVIOUS_GCNO_FILES+1
done
echo "Cleanup completed. ${NUM_PREVIOUS_GCNO_FILES} files deleted."

# Make sure //test/coverage:coverage_tests is up-to-date.
SCRIPT_DIR="$(realpath "$(dirname "$0")")"
(BAZEL_BIN="${BAZEL_COVERAGE}" "${SCRIPT_DIR}"/coverage/gen_build.sh ${COVERAGE_TARGETS})

echo "Cleaning .gcda/.gcov from previous coverage runs..."
NUM_PREVIOUS_GCOV_FILES=0
for f in $(find -L "${GCOVR_DIR}" -name "*.gcda" -o -name "*.gcov")
do
rm -f "${f}"
let NUM_PREVIOUS_GCOV_FILES=NUM_PREVIOUS_GCOV_FILES+1
done
echo "Cleanup completed. ${NUM_PREVIOUS_GCOV_FILES} files deleted."
"${SCRIPT_DIR}"/coverage/gen_build.sh ${COVERAGE_TARGETS}

# Force dbg for path consistency later, don't include debug code in coverage.
BAZEL_BUILD_OPTIONS="${BAZEL_BUILD_OPTIONS} -c dbg --copt=-DNDEBUG"

# Run all tests under "bazel test", no sandbox. We're going to generate the
# .gcda inplace in the bazel-out/ directory. This is in contrast to the "bazel
# coverage" method, which is currently broken for C++ (see
# https://github.com/bazelbuild/bazel/issues/1118). This works today as we have
# a single coverage test binary and do not require the "bazel coverage" support
# for collecting multiple traces and glueing them together.
"${BAZEL_COVERAGE}" test //test/coverage:coverage_tests ${BAZEL_BUILD_OPTIONS} \
--cache_test_results=no --cxxopt="--coverage" --cxxopt="-DENVOY_CONFIG_COVERAGE=1" \
--linkopt="--coverage" --define ENVOY_CONFIG_COVERAGE=1 --test_output=streamed \
--strategy=Genrule=standalone --spawn_strategy=standalone --test_timeout=4000 \
--test_arg="--log-path /dev/null" --test_arg="-l trace"

# The Bazel build has a lot of whack in it, in particular generated files, headers from external
# deps, etc. So, we exclude this from gcov to avoid false reporting of these files in the html and
# stats. The #foo# pattern is because gcov produces files such as
# bazel-out#local-fastbuild#bin#external#spdlog_git#_virtual_includes#spdlog#spdlog#details#pattern_formatter_impl.h.gcov.
# To find these while modifying this regex, perform a gcov run with -k set.
[[ -z "${GCOVR_EXCLUDE_REGEX}" ]] && GCOVR_EXCLUDE_REGEX=".*pb\\..*|test#.*|.*#test#.*|external#.*|.*#external#.*|.*#prebuilt#.*|.*#config_validation#.*|.*#chromium_url#.*"
[[ -z "${GCOVR_EXCLUDE_DIR}" ]] && GCOVR_EXCLUDE_DIR=".*/external/.*"
BAZEL_USE_LLVM_NATIVE_COVERAGE=1 GCOV=llvm-profdata bazel coverage ${BAZEL_BUILD_OPTIONS} \
-c fastbuild --copt=-DNDEBUG --instrumentation_filter=//source/...,//include/... \
--test_timeout=2000 --cxxopt="-DENVOY_CONFIG_COVERAGE=1" --test_output=streamed \
--test_arg="--log-path /dev/null" --test_arg="-l trace" --test_env=HEAPCHECK= \
//test/coverage:coverage_tests

COVERAGE_DIR="${SRCDIR}"/generated/coverage
mkdir -p "${COVERAGE_DIR}"
COVERAGE_SUMMARY="${COVERAGE_DIR}/coverage_summary.txt"

# Copy .gcno objects into the same location that we find the .gcda.
# TODO(htuch): Should use rsync, but there are some symlink loops to fight.
echo "Finding and copying .gcno files in GCOVR_DIR: ${GCOVR_DIR}"
mkdir -p ${GCNO_ROOT}
NUM_GCNO_FILES=0
for f in $(find -L bazel-out/ -name "*.gcno")
do
cp --parents "$f" ${GCNO_ROOT}/
let NUM_GCNO_FILES=NUM_GCNO_FILES+1
done
echo "OK: copied ${NUM_GCNO_FILES} .gcno files"

# gcovr is extremely picky about where it is run and where the paths of the
# original source are relative to its execution location.
cd -P "${GCOVR_DIR}"
echo "Running gcovr in $(pwd)..."
time "${GCOVR}" -v --gcov-exclude="${GCOVR_EXCLUDE_REGEX}" \
--exclude-directories="${GCOVR_EXCLUDE_DIR}" -r . \
--html --html-details --exclude-unreachable-branches --print-summary \
-o "${COVERAGE_DIR}"/coverage.html > "${COVERAGE_SUMMARY}"
COVERAGE_IGNORE_REGEX="(/external/|pb\.(validate\.)?(h|cc)|/chromium_url/|/test/|/tmp)"
COVERAGE_BINARY="bazel-bin/test/coverage/coverage_tests"
COVERAGE_DATA="bazel-out/k8-fastbuild/testlogs/test/coverage/coverage_tests/coverage.dat"

# Clean up the generated test/coverage/BUILD file: subsequent bazel invocations
# can choke on it if it references things that changed since the last coverage
# run.
rm "${SRCDIR}"/test/coverage/BUILD
echo "Generating report..."
llvm-cov show "${COVERAGE_BINARY}" -instr-profile="${COVERAGE_DATA}" -Xdemangler=c++filt \
-ignore-filename-regex="${COVERAGE_IGNORE_REGEX}" -output-dir=${COVERAGE_DIR} -format=html
sed -i -e 's|>proc/self/cwd/|>|g' "${COVERAGE_DIR}/index.html"
sed -i -e 's|>bazel-out/[^/]*/bin/\([^/]*\)/[^<]*/_virtual_includes/[^/]*|>\1|g' "${COVERAGE_DIR}/index.html"

[[ -z "${ENVOY_COVERAGE_DIR}" ]] || rsync -av "${COVERAGE_DIR}"/ "${ENVOY_COVERAGE_DIR}"

if [ "$VALIDATE_COVERAGE" == "true" ]
then
COVERAGE_VALUE=$(grep -Po 'lines: \K(\d|\.)*' "${COVERAGE_SUMMARY}")
COVERAGE_THRESHOLD=97.5
COVERAGE_VALUE=$(llvm-cov export "${COVERAGE_BINARY}" -instr-profile="${COVERAGE_DATA}" \
-ignore-filename-regex="${COVERAGE_IGNORE_REGEX}" -summary-only | \
python3 -c "import sys, json; print(json.load(sys.stdin)['data'][0]['totals']['lines']['percent'])")
COVERAGE_THRESHOLD=96.9
COVERAGE_FAILED=$(echo "${COVERAGE_VALUE}<${COVERAGE_THRESHOLD}" | bc)
if test ${COVERAGE_FAILED} -eq 1; then
echo Code coverage ${COVERAGE_VALUE} is lower than limit of ${COVERAGE_THRESHOLD}
exit 1
else
echo Code coverage ${COVERAGE_VALUE} is good and higher than limit of ${COVERAGE_THRESHOLD}
fi
echo "HTML coverage report is in ${COVERAGE_DIR}/coverage.html"
fi
echo "HTML coverage report is in ${COVERAGE_DIR}/index.html"
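The validation step parses the JSON that `llvm-cov export -summary-only` emits. A self-contained sketch of that parsing with a fabricated summary payload (the real script pipes the exporter's output straight in, and compares with `bc` rather than python3):

```shell
# Fabricated llvm-cov export summary, mirroring the JSON shape the
# script's python3 one-liner indexes into; the percent value is made up.
SUMMARY='{"data":[{"totals":{"lines":{"percent":97.25}}}]}'
COVERAGE_VALUE=$(echo "$SUMMARY" | \
  python3 -c "import sys, json; print(json.load(sys.stdin)['data'][0]['totals']['lines']['percent'])")
COVERAGE_THRESHOLD=96.9
# The script uses bc for the float comparison; python3 is used here since
# the extraction step already depends on it.
if python3 -c "import sys; sys.exit(0 if ${COVERAGE_VALUE} >= ${COVERAGE_THRESHOLD} else 1)"; then
  echo "coverage ${COVERAGE_VALUE} meets threshold ${COVERAGE_THRESHOLD}"
else
  echo "coverage ${COVERAGE_VALUE} below threshold ${COVERAGE_THRESHOLD}"
fi
```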