Deploy some SLC dependencies in own S3 Bucket #581

Open
tomuben opened this issue May 2, 2022 · 0 comments
Labels
refactoring (Code improvement without behavior change)

tomuben commented May 2, 2022

Background

SLCs have a large number of dependencies from different sources (Ubuntu, CRAN, PyPI, GCloud (SWIG), Docker, GitHub, Exasol website).
Frequently one of these sources has connection issues, which breaks the nightly/CI builds.
We could host some small, stable artifacts in our own S3 bucket and pull them from there during the build; this is already done for SWIG. Some Exasol-specific artifacts could also be hosted there (see https://console.cloud.google.com/cloud-build/builds;region=global/fcae648f-da77-4ec6-bee3-a8ddf0a77312?project=script-languages-build).

Acceptance Criteria

  1. Create a public bucket on S3 with public read-only access (for example via a bucket policy like the sketch below)
  2. Deploy specific artifacts there:
  • Exasol ODBC driver
  • SWIG
  3. Change the paths in the Dockerfiles to pull the artifacts from there (see the Dockerfile sketch after the note below)
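A minimal sketch of what the public read-only access could look like, assuming a hypothetical bucket name `exasol-slc-build-artifacts`; the actual bucket name and region are not specified in this issue:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadOnlyAccess",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::exasol-slc-build-artifacts/*"
    }
  ]
}
```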

Note: the Exasol JDBC driver will be moved to Maven Central and can be fetched from there.
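A minimal Dockerfile sketch of the intended change, assuming the hypothetical bucket name and object keys used above; the real download URLs, artifact names, and versions would come from the existing Dockerfiles:

```dockerfile
# Before: the artifact is pulled directly from the upstream source (e.g. the Exasol website),
# which occasionally has connection issues and breaks the nightly/CI build.
# RUN curl -fsSL -o /tmp/exasol-odbc.tar.gz <upstream download URL>

# After: pull the mirrored artifact from the public S3 bucket instead
# (bucket name and object key are hypothetical).
RUN curl -fsSL -o /tmp/exasol-odbc.tar.gz \
        https://exasol-slc-build-artifacts.s3.amazonaws.com/odbc/exasol-odbc.tar.gz \
    && tar -xzf /tmp/exasol-odbc.tar.gz -C /opt \
    && rm /tmp/exasol-odbc.tar.gz
```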
