Feature: JDK 17 Upgrade Support Series - Pipeline Invocation Service #403

Closed
10 tasks done
csun-cpointe opened this issue Oct 9, 2024 · 3 comments · Fixed by #416
csun-cpointe (Contributor) commented Oct 9, 2024

Description

In #133 we modified the core libraries to use JDK 17. As part of the Java 17 upgrade series, this ticket focuses on the pipeline invocation service functionality.

DOD

  • Upgrade the pipeline invocation service to build with Java 17:
    • Ensure all modules are compatible with Java 17:
      • extensions-docker-pipeline-invocation
      • extensions-pipeline-invocation-service
      • extensions-helm-pipeline-invocation-service
    • Clean up the POM to remove unnecessary dependencies (see the verification sketch after this list)
    • Pipeline invocation service functionality works as expected:
      • Pipeline can be started through the REST API
      • Pipeline REST health check works
      • Pipeline can be started through Kafka
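
A minimal sketch of spot-checking the Java 17 and POM-cleanup items locally; the maven.compiler.release property name is an assumption about how each module configures its target JDK:

    # Confirm Maven itself is running on JDK 17
    mvn -version

    # From within each module listed above, check the configured compiler release
    # (assumes the standard maven.compiler.release property is used)
    mvn -q help:evaluate -Dexpression=maven.compiler.release -DforceStdout

    # Report unused declared dependencies as candidates for POM cleanup
    mvn dependency:analyze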

Test Strategy/Script

How will this feature be verified?

  1. Create a new project on 1.10.0-SNAPSHOT.

    mvn archetype:generate '-DarchetypeGroupId=com.boozallen.aissemble' \
                           '-DarchetypeArtifactId=foundation-archetype' \
                           '-DarchetypeVersion=1.10.0-SNAPSHOT' \
                           '-DgroupId=org.test' \
                           '-Dpackage=org.test' \
                           '-DprojectGitUrl=test.org/test.git' \
                           '-DprojectName=Final Test 403' \
                           '-DartifactId=test-403-final' \
    && cd test-403-final
  2. Set your Java version to 17 if it is not currently set. For example:
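
    A minimal sketch of checking and switching the active JDK; the SDKMAN! identifier and JAVA_HOME path below are illustrative examples, not project requirements:

      # Check the active Java version
      java -version

      # Option 1: switch via SDKMAN! (distribution identifier is an example)
      sdk use java 17.0.9-tem

      # Option 2: point JAVA_HOME at a local JDK 17 install (path is an example)
      export JAVA_HOME=/path/to/jdk-17
      export PATH="$JAVA_HOME/bin:$PATH"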

  3. Under -model/src/main/resources/pipelines, add both the PysparkPipeline.json and SparkPipeline.json files

  4. Fully generate the project by running mvn clean install and following the manual actions

  5. Unzip the values-dev.yaml.zip and replace the -deploy/src/main/resources/apps/pipeline-invocation-service/values-dev.yaml file, for example:
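
    A minimal sketch, assuming the zip was downloaded to the project root; the deploy module directory name below is an assumption based on the artifactId used in step 1:

      DEPLOY_MODULE=test-403-final-deploy   # assumption: <artifactId>-deploy
      unzip values-dev.yaml.zip
      cp values-dev.yaml "$DEPLOY_MODULE/src/main/resources/apps/pipeline-invocation-service/values-dev.yaml"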

  6. Build the project without the cache and follow the last manual action.

    mvn clean install -Dmaven.build.cache.skipCache
  7. Deploy the project.

    tilt up; tilt down
  8. Once all the resources have started (this may take a while), check that the pipeline-invocation-service is healthy (Note: if you get a connection refused error, use Rancher Desktop to forward the invocation-rest port to 30004).
    Use Postman or any REST client to make the invocation REST call and verify that it responds that the service is available. For example:
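
    A minimal sketch with curl, assuming the service is reachable on localhost:30004 and exposes a Quarkus-style health endpoint at /q/health; adjust the path to whatever your generated project actually uses:

      curl -i http://localhost:30004/q/health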

  9. Invoke the pipeline via the pipeline-invocation-service REST call, for example:
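
    A minimal sketch with curl; the /invoke-pipeline path is a hypothetical placeholder (check your project for the actual endpoint), while the JSON payload mirrors the Kafka message used in step 11:

      curl -i -X POST http://localhost:30004/invoke-pipeline \
           -H 'Content-Type: application/json' \
           -d '{"applicationName":"spark-pipeline"}'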

  10. Verify the spark-pipeline is successfully invoked:
  • Verify that the content below is shown in the pipeline-invocation-service log:
2024-10-16 17:35:41,224 INFO  [com.boo.ais.pip.inv.ser.end.HttpEndpoint] (executor-thread-1) Received HTTP request to submit spark-pipeline.
2024-10-16 17:35:41,232 INFO  [com.boo.ais.pip.inv.ser.uti.exe.HelmCommandExecutor] (executor-thread-1) Executing Helm command: helm uninstall spark-pipeline
2024-10-16 17:35:41,344 INFO  [com.boo.ais.pip.inv.ser.uti.exe.HelmCommandExecutor] (Exec Stream Pumper) release "spark-pipeline" uninstalled
2024-10-16 17:35:41,351 INFO  [com.boo.ais.pip.inv.ser.uti.exe.HelmCommandExecutor] (executor-thread-1) Executing Helm command: helm install spark-pipeline oci://ghcr.io/boozallen/aissemble-spark-application-chart --version 1.10.0-SNAPSHOT --values /deployments/sparkApplicationValues/spark-pipeline-base-values.yaml --values /deployments/sparkApplicationValues/spark-pipeline-dev-values.yaml --set service.enabled=false
2024-10-16 17:35:44,198 INFO  [com.boo.ais.pip.inv.ser.uti.exe.HelmCommandExecutor] (Exec Stream Pumper) Pulled: ghcr.io/boozallen/aissemble-spark-application-chart:1.10.0-SNAPSHOT
2024-10-16 17:35:44,198 INFO  [com.boo.ais.pip.inv.ser.uti.exe.HelmCommandExecutor] (Exec Stream Pumper) Digest: sha256:dbb62c0fe25f64017ed1be12762deaa9afcd948e5a63a2d38ade45bda9bdfb11
2024-10-16 17:35:44,383 INFO  [com.boo.ais.pip.inv.ser.uti.exe.HelmCommandExecutor] (Exec Stream Pumper) NAME: spark-pipeline
2024-10-16 17:35:44,383 INFO  [com.boo.ais.pip.inv.ser.uti.exe.HelmCommandExecutor] (Exec Stream Pumper) LAST DEPLOYED: Wed Oct 16 17:35:44 2024
2024-10-16 17:35:44,383 INFO  [com.boo.ais.pip.inv.ser.uti.exe.HelmCommandExecutor] (Exec Stream Pumper) NAMESPACE: default
2024-10-16 17:35:44,384 INFO  [com.boo.ais.pip.inv.ser.uti.exe.HelmCommandExecutor] (Exec Stream Pumper) STATUS: deployed
2024-10-16 17:35:44,384 INFO  [com.boo.ais.pip.inv.ser.uti.exe.HelmCommandExecutor] (Exec Stream Pumper) REVISION: 1
2024-10-16 17:35:44,384 INFO  [com.boo.ais.pip.inv.ser.uti.exe.HelmCommandExecutor] (Exec Stream Pumper) TEST SUITE: None
2024-10-16 17:35:44,387 INFO  [com.boo.ais.pip.inv.ser.end.HttpEndpoint] (executor-thread-1) Submitted spark-pipeline for processing.

  • Run kubectl logs -f spark-pipeline-driver to verify the pipeline executed successfully:
24/10/16 17:36:17 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
24/10/16 17:36:17 INFO SparkContext: Successfully stopped SparkContext
  11. Invoke the pipeline via Kafka by sending the pipeline invocation message:

    • Run kubectl exec -it kafka-cluster-0 -- sh to enter the Kafka container
    • Run /opt/bitnami/kafka/bin/kafka-console-producer.sh --bootstrap-server localhost:9093 --topic pipeline-invocation to start a console producer for the pipeline-invocation topic
    • Send the {"applicationName":"pyspark-pipeline"} message to invoke the pyspark-pipeline (or use the one-liner sketched below)
  12. Verify that the pyspark-pipeline is successfully invoked:

  • Verify that the content below is shown in the pipeline-invocation-service log:
2024-10-16 17:39:15,347 INFO  [com.boo.ais.pip.inv.ser.end.MessageEndpoint] (pool-5-thread-1) Received message request to submit pyspark-pipeline.
2024-10-16 17:39:15,349 INFO  [com.boo.ais.pip.inv.ser.uti.exe.HelmCommandExecutor] (pool-5-thread-1) Executing Helm command: helm uninstall pyspark-pipeline
2024-10-16 17:39:15,523 INFO  [com.boo.ais.pip.inv.ser.uti.exe.HelmCommandExecutor] (Exec Stream Pumper) release "pyspark-pipeline" uninstalled
2024-10-16 17:39:15,534 INFO  [com.boo.ais.pip.inv.ser.uti.exe.HelmCommandExecutor] (pool-5-thread-1) Executing Helm command: helm install pyspark-pipeline oci://ghcr.io/boozallen/aissemble-spark-application-chart --version 1.10.0-SNAPSHOT --values /deployments/sparkApplicationValues/pyspark-pipeline-base-values.yaml --values /deployments/sparkApplicationValues/pyspark-pipeline-dev-values.yaml --set service.enabled=false
2024-10-16 17:39:18,321 INFO  [com.boo.ais.pip.inv.ser.uti.exe.HelmCommandExecutor] (Exec Stream Pumper) Pulled: ghcr.io/boozallen/aissemble-spark-application-chart:1.10.0-SNAPSHOT
2024-10-16 17:39:18,321 INFO  [com.boo.ais.pip.inv.ser.uti.exe.HelmCommandExecutor] (Exec Stream Pumper) Digest: sha256:dbb62c0fe25f64017ed1be12762deaa9afcd948e5a63a2d38ade45bda9bdfb11
2024-10-16 17:39:18,537 INFO  [com.boo.ais.pip.inv.ser.uti.exe.HelmCommandExecutor] (Exec Stream Pumper) NAME: pyspark-pipeline
2024-10-16 17:39:18,539 INFO  [com.boo.ais.pip.inv.ser.uti.exe.HelmCommandExecutor] (Exec Stream Pumper) LAST DEPLOYED: Wed Oct 16 17:39:18 2024
2024-10-16 17:39:18,539 INFO  [com.boo.ais.pip.inv.ser.uti.exe.HelmCommandExecutor] (Exec Stream Pumper) NAMESPACE: default
2024-10-16 17:39:18,540 INFO  [com.boo.ais.pip.inv.ser.uti.exe.HelmCommandExecutor] (Exec Stream Pumper) STATUS: deployed
2024-10-16 17:39:18,540 INFO  [com.boo.ais.pip.inv.ser.uti.exe.HelmCommandExecutor] (Exec Stream Pumper) REVISION: 1
2024-10-16 17:39:18,540 INFO  [com.boo.ais.pip.inv.ser.uti.exe.HelmCommandExecutor] (Exec Stream Pumper) TEST SUITE: None
2024-10-16 17:39:18,545 INFO  [com.boo.ais.pip.inv.ser.end.MessageEndpoint] (pool-5-thread-1) Submitted pyspark-pipeline for processing.
  • Run kubectl logs -f pyspark-pipeline-driver to verify the pipeline executed successfully:
2024/10/16 17:39:30 INFO IngestBase: START: step execution...
2024/10/16 17:39:30 WARNING Ingest: Implement execute_step_impl(..) or remove this pipeline step!
2024/10/16 17:39:30 INFO IngestBase: COMPLETE: step execution completed in 0.133125ms
@csun-cpointe csun-cpointe added the enhancement New feature or request label Oct 9, 2024
@csun-cpointe csun-cpointe self-assigned this Oct 9, 2024
@csun-cpointe csun-cpointe added this to the 1.10.0 milestone Oct 9, 2024
csun-cpointe (Contributor, Author)

DoD completed with @ewilkins-csi

carter-cundiff (Contributor)

OTS passed

ewilkins-csi (Contributor)

Final Test passed ✅

  • REST healthcheck was successful (screenshot attached)
  • REST submit was successful (screenshot attached)
  • Spark pipeline was submitted (screenshot attached)
  • Spark pipeline finished successfully (screenshot attached)
  • Pyspark pipeline was submitted (screenshot attached)
  • Pyspark pipeline finished successfully (screenshot attached)
