
GitLab-specific instructions not working #30

Closed
rsenden opened this issue Oct 5, 2021 · 14 comments

Comments

@rsenden
Contributor

rsenden commented Oct 5, 2021

Due to a change in configuration file handling, the GitLab-specific instructions in the README.md file are no longer functional. GitLab seems to be passing sh to the Docker container, which was ignored by older versions of the image but FortifyVulnerabilityExporter version 1.5.0 tries to interpret this as a configuration file specification. As there is no configuration file named sh, FortifyVulnerabilityExporter will fail.

As a quick work-around, please try using the older version of the Docker image using the following image instruction in your .gitlab-ci.yml:

image: fortifydocker/fortify-vulnerability-exporter:v1.4.1

Alternatively, you can use FortifyToolsInstaller to install and run FortifyVulnerabilityExporter in any arbitrary container; see https://gitlab.com/Fortify/example-eightball/-/blob/master/.gitlab-ci.yml#L58 for an example.
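For reference, a minimal `.gitlab-ci.yml` job using the pinned image could look like the following sketch; the job name and variable values are illustrative placeholders to be adapted to your own SSC setup:

```yaml
fortify_export:
  # Pin the pre-1.5.0 image so the stray "sh" argument is ignored
  image: fortifydocker/fortify-vulnerability-exporter:v1.4.1
  variables:
    export_config: /config/SSCToGitLab.yml
    ssc_baseUrl: "${SSC_BASE_URL}"
    ssc_authToken: "${SSC_AUTH_TOKEN}"
  script:
    - echo "No-op; a script section is required by GitLab CI syntax"
  artifacts:
    reports:
      sast: gl-fortify-sast.json
```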

@ferricoxide

Was trying to integrate a self-hosted SSC with a self-hosted GitLab instance and had been running into the aforementioned sh issue. Selecting the v1.4.1-tagged container addressed that issue. Unfortunately, our SSC (and GitLab) are using SSL certificates signed by a private CA, so I'm (not surprisingly) getting:

Caused by: javax.ws.rs.ProcessingException: javax.net.ssl.SSLHandshakeException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target

and then failure-exits. Is there a recommended work-around for this, given the nature of my testing environment?

@rsenden
Contributor Author

rsenden commented Jan 13, 2022

I don't think there's an easy way to have the Docker image use a custom trust store, but you could consider one of the following work-arounds:

  • Build a custom image that uses fortifydocker/fortify-vulnerability-exporter as its base image, adding the necessary certificates to the cacerts file
  • Run FortifyVulnerabilityExporter on a different image that already has the necessary certificates (and any other prerequisites) installed. Easiest way to do so is to use FortifyToolsInstaller as demonstrated here: https://gitlab.com/Fortify/example-eightball/-/blob/master/.gitlab-ci.yml#L58

@ferricoxide

Ultimately, I ended up doing:

  • Build a custom image that uses fortifydocker/fortify-vulnerability-exporter as its base image, adding the necessary certificates to the cacerts file

In my case, since the SSC server's certificate was signed by an intermediate-CA, I had to import both the root-CA and the intermediate CAs in order for the exporter to be happy with things.

For anyone in a similar scenario who has Googled their way to this issue thread, this entailed:

  1. Exporting the SSC server's root CA in DER format

  2. Exporting the SSC server's intermediate CAs in DER format

  3. Adding a COPY operation to pull the DER-formatted certificate-file into the image-build

  4. Invoking keytool to import the CAs' DER-formatted certificate-files into the image's Java keystore:

    echo yes | /opt/java/openjdk/bin/keytool -import -trustcacerts \
      -alias <CA_NAME> -file /tmp/<CA_DER_FILE> \
      -keystore /opt/java/openjdk/lib/security/cacerts \
      -storepass changeit

    Via a RUN sequence. Note that:

    • I had to pass the -alias option since I was injecting more than one CA
    • The password-string for -storepass is OpenJDK's default value; keytool requires this string as part of the operation.
    • The keytool utility expects an acknowledgement of the import-request: the echo yes | before the keytool invocation takes care of providing that acknowledgement.
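Putting those four steps together, a Dockerfile along these lines should work. This is a sketch: the base-image tag, the certificate file names, and the aliases (root-ca, intermediate-ca) are placeholders to be replaced with your own values.

```dockerfile
FROM fortifydocker/fortify-vulnerability-exporter:v1.4.1

# Placeholder names for the DER-format CA certificates from steps 1 and 2
COPY root-ca.der intermediate-ca.der /tmp/

# Import both CAs into the image's default Java trust store;
# "changeit" is the stock OpenJDK cacerts password
RUN echo yes | /opt/java/openjdk/bin/keytool -import -trustcacerts \
      -alias root-ca -file /tmp/root-ca.der \
      -keystore /opt/java/openjdk/lib/security/cacerts \
      -storepass changeit && \
    echo yes | /opt/java/openjdk/bin/keytool -import -trustcacerts \
      -alias intermediate-ca -file /tmp/intermediate-ca.der \
      -keystore /opt/java/openjdk/lib/security/cacerts \
      -storepass changeit && \
    rm /tmp/root-ca.der /tmp/intermediate-ca.der
```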

@rsenden
Contributor Author

rsenden commented Jan 20, 2022

Thanks for sharing the instructions! I'll look at adding some information on using custom certificates in the documentation for a future version of FortifyVulnerabilityExporter; for now users can refer to the information you shared.

@ferricoxide

Note: still getting issues when trying to use anything newer than the 1.4.1 version.

It's more of a problem, now, because the customer I'm doing work for doesn't feel comfortable using images off of DockerHub and wants me to repackage things onto UBI8. Tried gleaning useful information from this project's GitHub Actions content. However, unless I use -b v1.4.1 on my clone operation, I get the 1.5.x code-base which results in GitLab throwing the sh error. When I do use the -b v1.4.1 to create my container, the jobs that run from that container abort with:

2022-03-23 18:18:30.023  INFO 1 --- [           main] e.p.PluginConfigEnvironmentPostProcessor : Loaded 0 plugin configuration files
2022-03-23 18:18:30.032  INFO 1 --- [           main] c.f.v.FortifyVulnerabilityExporter       : Starting FortifyVulnerabilityExporter vunspecified using Java 17.0.2 on runner-2v-baay6-project-2399-concurrent-0 with PID 1 (/opt/FortifyVulnerabilityExporter/libs/FortifyVulnerabilityExporter.jar started by app in /app)
2022-03-23 18:18:30.033  INFO 1 --- [           main] c.f.v.FortifyVulnerabilityExporter       : The following profiles are active: default
2022-03-23 18:18:31.598  WARN 1 --- [           main] s.c.a.AnnotationConfigApplicationContext : Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'fortifyVulnerabilityExporterRunnerFactory': Unsatisfied dependency expressed through field 'activeVulnerabilityLoaderFactory'; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'activeVulnerabilityLoaderFactory': Unsatisfied dependency expressed through field 'availableVulnerabilityLoaderFactories'; nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'java.util.Collection<com.fortify.vulnexport.api.vuln.loader.IVulnerabilityLoaderFactory>' available: expected at least 1 bean which qualifies as autowire candidate. Dependency annotations: {@org.springframework.beans.factory.annotation.Autowired(required=true)}
2022-03-23 18:18:31.658 ERROR 1 --- [           main] o.s.b.d.LoggingFailureAnalysisReporter   : 
***************************
APPLICATION FAILED TO START
***************************
Description:
Field availableVulnerabilityLoaderFactories in com.fortify.vulnexport.api.vuln.loader.active.ActiveVulnerabilityLoaderFactory required a bean of type 'java.util.Collection' that could not be found.
The injection point has the following annotations:
	- @org.springframework.beans.factory.annotation.Autowired(required=true)
Action:
Consider defining a bean of type 'java.util.Collection' in your configuration.

So, I think I'm missing something in my Dockerfile. Is there a reference Dockerfile, or is this project's GitHub Actions content the closest I'm going to get?

@rsenden
Contributor Author

rsenden commented Mar 24, 2022

The standard Docker images are built directly by Gradle; if you want to build a custom image from scratch, following is a snippet that I use in another Dockerfile:

ARG FTI_VERSION=v2.8.0
ARG FTI_DOWNLOAD=/tmp/FortifyToolsInstaller.sh
ARG FTI_DOWNLOAD_SHA256=367400379a228658c722260b779389102d5d44e1612c51ef1db7055d91930191
ARG FVE_VERSION=v1.5.4
ARG FTI_TOOLS=FVE:$FVE_VERSION

ARG FORTIFY_TOOLS_HOME=/opt/Fortify
ARG FVE_HOME=$FORTIFY_TOOLS_HOME/FortifyVulnerabilityExporter

ENV PATH="${PATH}:$FVE_HOME/bin"

RUN apt-get install -y unzip \
  && curl -fsSL https://raw.githubusercontent.com/fortify/FortifyToolsInstaller/${FTI_VERSION}/FortifyToolsInstaller.sh -o ${FTI_DOWNLOAD} \
  && echo "Checking download hash" \
  && echo "Expecting ${FTI_DOWNLOAD_SHA256}" \
  && echo "Actual $(sha256sum ${FTI_DOWNLOAD} | head -c64)" \
  && echo "${FTI_DOWNLOAD_SHA256} ${FTI_DOWNLOAD}" | sha256sum --check - \
  && /bin/bash -c "FTI_VARS_OUT=verify source /tmp/FortifyToolsInstaller.sh" \
  && rm /tmp/FortifyToolsInstaller.sh \
  && apt-get remove -y --purge unzip
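The download-verification pattern in that RUN step can be illustrated standalone. This is a hedged sketch using a throwaway file; the file name and contents are placeholders, not the real installer:

```shell
#!/bin/sh
# Sketch of the verification pattern used above: compare a file's actual
# SHA-256 against a pinned expected value and abort on mismatch.
set -e

# Stand-in for the downloaded installer (placeholder content)
printf 'echo hello\n' > /tmp/installer.sh

# In a real Dockerfile this hash is hard-coded ahead of time;
# here we compute it so the example is self-contained
EXPECTED_SHA256="$(sha256sum /tmp/installer.sh | head -c64)"

# sha256sum --check reads "HASH  FILE" lines from stdin and exits non-zero
# on mismatch, which fails the build thanks to `set -e`
echo "${EXPECTED_SHA256}  /tmp/installer.sh" | sha256sum --check -
# prints "/tmp/installer.sh: OK"
```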

I think GitLab expects a container with no entrypoint, but rather a container where it can start a shell to run the script steps; this is the main cause of this particular issue. So, for your custom Docker image, it's probably best to not have an ENTRYPOINT or CMD instruction, and invoke FortifyVulnerabilityExporter from the script section in your GitLab pipeline.

As for the particular error that you are seeing, this is because the plugin jar files cannot be found in one of the default locations (notice the first INFO message in your output, stating that 0 plugin configuration files have been loaded). As of version 1.5.0, plugin jars are bundled with the main application jar so you should no longer have that issue.

@ferricoxide

Thank you for the response.

Yeah, working with containerized tools in GitLab has been fun. Previously, I'd written a Python-based utility that I'd intended to use as both a standalone tool (hand-executable with commandline switches and flags) as well as a containerized application. Unfortunately, the GitLab runners always pass in that sh-based start-script which breaks tools that have any CLI option-processing capabilities.

My first stab at trying to get a container that meets their starting-point desire was to pull a copy of the (v1.4.1) DockerHub image to use as a base-stage, and then copy off the relevant bits (a really icky kludge):

COPY --from=base /app /app
COPY --from=base /config /config
COPY --from=base /default /default

And, while it produces a functional container, it doesn't solve my eventual security problem with getting an allowance for bringing the DockerHub-hosted container into the production environment. So, I started down the path of trying to build a container from source using the recommended, Gradle-based method:

RUN cd /root && \
    git clone https://github.com/fortify/FortifyVulnerabilityExporter.git -b v1.4.1 && \
    cd FortifyVulnerabilityExporter && ./gradlew clean build && \
    install -dDm 0755 /opt/FortifyVulnerabilityExporter && \
    install -dDm 0755 -o root -g root /config && \
    install -dDm 0755 -o root -g root /default && \
    cd /default && ln -s ../config config && \
    cd /root/FortifyVulnerabilityExporter/build/ && tar cf - . | \
      ( cd /opt/FortifyVulnerabilityExporter ; tar xvf - )

Unfortunately, that method didn't produce a functional image (gives me the previously-posted error).

@ferricoxide

Ok, so, for what it's worth… In order for the 1.5.x content to function with GitLab, setting:

ENTRYPOINT [ "/bin/bash", "-c", "FortifyVulnerabilityExporter" ]

After having used:

RUN FTI_TOOLS="fve:v1.5.4" FORTIFY_HOME="/opt/Fortify" /root/FortifyToolsInstaller/FortifyToolsInstaller.sh

in my build container, and having set the PATH value in the final container of my multi-stage build, results in a container that "just works" under GitLab CI.
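A rough sketch of that multi-stage build, under the assumption that the build stage already has the FortifyToolsInstaller checkout (plus curl and JDK prerequisites) in place; the base images and paths here are illustrative:

```dockerfile
# Build stage: assumes /root/FortifyToolsInstaller was cloned earlier and
# that curl and a JDK are available (illustrative base image)
FROM registry.access.redhat.com/ubi8/ubi AS build
RUN FTI_TOOLS="fve:v1.5.4" FORTIFY_HOME="/opt/Fortify" \
      /root/FortifyToolsInstaller/FortifyToolsInstaller.sh

# Final stage: copy only the installed tools and expose them on PATH.
# With bash -c, the stray "sh" argument GitLab appends becomes a harmless
# positional parameter, so the exporter still starts cleanly.
FROM registry.access.redhat.com/ubi8/ubi
COPY --from=build /opt/Fortify /opt/Fortify
ENV PATH="${PATH}:/opt/Fortify/FortifyVulnerabilityExporter/bin"
ENTRYPOINT [ "/bin/bash", "-c", "FortifyVulnerabilityExporter" ]
```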

@rsenden
Contributor Author

rsenden commented Mar 24, 2022

Interesting; that makes me wonder what happens to the sh that GitLab usually passes. Any idea? How are you invoking the container from your GitLab pipeline?

@ferricoxide

I'm doing a multi-step pipeline. I have it "stageless", with the export step (fortify_ssc-export) set to wait for the fortify_push2ssc step to complete. But my pipeline is basically as described in the SSC GitLab section of the documentation:

fortify_ssc-export:
  image: ${CI_REGISTRY}/${FORTIFY_CONTAINER_EXPORTER}
  needs:
    - job: fortify_push2ssc
      artifacts: false
  variables:
    export_config: /config/SSCToGitLab.yml
    ssc_baseUrl: "https://${SSC_URL}"
    ssc_authToken: "${SSC_TRANSFER_TOKEN}"
    ssc_version_name: "${SSC_APP_NAME}:${SSC_APP_VERS:-${CI_BUILD_REF_NAME}}"
    _JAVA_OPTIONS: "-Djavax.net.ssl.trustStore=/certs/java/cacerts"
  script:
    - echo "Script not executed but required for CI-def to pass lint"
  allow_failure: true
  artifacts:
    paths:
      - gl-fortify-sast.json
      - gl-fortify-dast.json
      - gl-fortify-depscan.json
    reports:
      sast: gl-fortify-sast.json
      dast: gl-fortify-dast.json
      dependency_scanning: gl-fortify-depscan.json

The group that requested my automation assistance wanted the exported content not only made available in the GitLab security dashboards, but also as downloadable artifacts, even though I have an earlier step (which runs concurrently with the fortify_push2ssc step) that generates PDF files from the FPR files and uploads them as artifacts.

Note: being in a private VPC, the environment's services leverage a private CA which further necessitates the use of the _JAVA_OPTIONS statement.

@POPINACAP

I know this is an old issue, but I thought I would leave a solution that worked for me, in case others are still having this issue with the newer versions like I did.
If you drop to an empty entrypoint and then invoke the original entrypoint in the script section of your .gitlab-ci.yml, this issue gets bypassed.
Here is an example of what the .gitlab-ci.yml will look like:

fortify_export:
  image: 
    name: fortifydocker/fortify-vulnerability-exporter
    entrypoint: [""]
  variables: 
    export_config: /config/SSCToGitlab.yml
    ssc_baseUrl: ${SSC_BASE_URL}
    ssc_authToken: ${SSC_CI_TOKEN_DECODED}
    ssc_version_id: ${SSC_APP_ID}
  script: 
    - java -DpopulateContainerDirs=true -cp "/app/classpath/*:app/libs/*" com.fortify.vulnexport.FortifyVulnerabilityExporter
  allow_failure: true
  artifacts: 
    reports: 
      sast: gl-fortify-sast.json
    paths: 
      - gl-fortify-sast.json
    when: always

Hopefully this helps.

@takelley1

takelley1 commented Feb 22, 2023

A docker-in-docker runner will also work:

---

fortify_scanning:
  tags:
    - dockerindocker
  script:
    - export name="$(date +%s)"
    - docker run --name "${name}" fortifydocker/fortify-vulnerability-exporter SSCToGitLab --ssc.baseUrl=https://example.com/ssc --ssc.authToken=${SSC_CI_TOKEN_DECODED} --ssc.version.name=${SSC_APP_ID}
    - docker cp "${name}:/export" ./
  allow_failure: true
  artifacts: 
    reports: 
      sast: ./export/gl-fortify-sast.json
      dast: ./export/gl-fortify-dast.json
      dependency_scanning: ./export/gl-fortify-depscan.json

@rsenden
Contributor Author

rsenden commented Feb 27, 2023

Thanks everyone for listing some work-arounds. At some point, I hope to find some time to come up with / document a proper solution.

@rsenden
Contributor Author

rsenden commented Jun 21, 2024

I somewhat forgot about this issue until someone ran into this problem again; I've now updated the documentation to list the work-around mentioned in #30 (comment).

As similar functionality is now available in fcli and FortifyVulnerabilityExporter will eventually be deprecated, we'll just stick to this work-around for now.
