
Feature: Update pipeline step messaging to be compatible with Java 17 #391

Closed · 8 tasks done
carter-cundiff opened this issue Oct 3, 2024 · 3 comments · Fixed by #407
Assignee: carter-cundiff · Labels: enhancement (New feature or request) · Milestone: 1.10.0

carter-cundiff commented Oct 3, 2024

Description

Follow-on to #133, where we want to migrate our whole repository to Java 17. Foundation-messaging was completed as part of #355, so this issue will focus on migrating the messaging implementation (extensions-messaging) to enable messaging within downstream projects.

DOD

  • Resolve the messaging issue within Spark data delivery pipelines
    • Create a Baton migration if applicable
  • Update extensions-messaging submodules to build with JDK 17
    • Ensure all pom dependencies are compatible
      • Update to the jakarta.* namespace if applicable (see the illustrative example after this list)
      • Update to an official Java 17-compatible version if applicable
    • Clean up the pom to remove unnecessary dependencies and add undeclared transitive dependencies
      • Use mvn dependency:analyze
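
For the jakarta.* update above, the change is mostly a package rename in imports (plus any matching descriptor entries); a minimal, hypothetical sketch follows, with bean and collaborator names that are illustrative rather than taken from extensions-messaging:

    // Hypothetical CDI bean showing the javax -> jakarta package rename.
    // Before the upgrade the equivalent imports were:
    //   import javax.enterprise.context.ApplicationScoped;
    //   import javax.inject.Inject;
    import jakarta.enterprise.context.ApplicationScoped;
    import jakarta.inject.Inject;

    @ApplicationScoped
    public class ExampleMessagingBean {

        @Inject
        ExampleCollaborator collaborator; // hypothetical dependency, for illustration only
    }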

Test Strategy/Script

Test New Project

  • Update your machine to build with JDK 17. Run mvn -v to verify it's configured properly. The output should be similar to the following (Note: Java distributions other than OpenJDK are okay):
$ mvn -v
Apache Maven 3.9.6 (bc0240f3c744dd6b6ec2920b3cd08dcc295161ae)
Maven home: /path/to/apache-maven-3.9.6-bin/3311e1d4/apache-maven-3.9.6
Java version: 17.0.8.1, vendor: Private Build, runtime: /usr/lib/jvm/java-17-openjdk-amd64
  • OTS Only:

    • Within the aiSSEMBLE repo, run the following and verify it builds successfully:
    mvn clean install -pl :foundation-upgrade,:foundation-mda,:extensions-messaging-kafka -am -Dmaven.build.cache.skipCache
    
  • Create a downstream project:

mvn archetype:generate -U -DarchetypeGroupId=com.boozallen.aissemble \
  -DarchetypeArtifactId=foundation-archetype \
  -DarchetypeVersion=1.10.0-SNAPSHOT \
  -DgroupId=com.test \
  -DartifactId=test-391 \
  -DprojectGitUrl=test.url \
  -DprojectName=test-391 \
  && cd test-391
  • Add the attached SparkPipelineMessaging.json to the test-391-pipeline-models/src/main/resources/pipelines/ directory

  • Run mvn clean install until all the manual actions are complete

  • Update the protected String executeStepImpl(String inbound) method within test-391-pipelines/spark-pipeline-messaging/src/main/java/com/test/SparkSyncStep.java to have the following:

    @Override
    protected String executeStepImpl(String inbound) {
        logger.info("Message received: {}", inbound);

        return "Exit message";
    }
  • Run mvn clean install -Dmaven.build.cache.skipCache to get any remaining manual actions
  • tilt up
  • Once all the resources are ready on the tilt ui, start the spark-pipeline-messaging resource
  • Wait until you see a log similar to the following; it should be the last log output:
Resetting offset for partition SparkInboundChannel-0 to position FetchPosition
  • In a separate terminal, exec into the kafka pod: kubectl exec -it kafka-cluster-0 -- sh
  • From the kafka pod, run the following:
/opt/bitnami/kafka/bin/kafka-console-producer.sh --bootstrap-server localhost:9093  --topic SparkInboundChannel
  • Then input the following: {"test":"InboundMessage"}, then hit <ENTER> once and <CTRL+C> to exit
  • Check the spark-pipeline-messaging resource in tilt and verify you now see the following logs:
INFO SparkSyncStep: Message received: {"test":"InboundMessage"}
  • From the kafka pod, run the following:
/opt/bitnami/kafka/bin/kafka-console-consumer.sh --bootstrap-server localhost:9093 --topic SparkOutboundChannel --from-beginning
  • Verify it returns the following: Exit message (the sketch after this checklist shows roughly how the step's return value reaches SparkOutboundChannel)

  • Use <CTRL+C> to exit the consumer and enter exit to leave the pod

  • tilt down

  • kubectl delete pvc data-kafka-cluster-0
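
As referenced in the verification step above, the sketch below shows roughly how the two Kafka channels in this test are tied together via MicroProfile Reactive Messaging (the stack behind SmallRye). Only the channel names, the inbound payload handling, and the "Exit message" return value come from the steps above; the class and method names are made up, and the real wiring lives in the generated step base class, which may differ.

    import jakarta.enterprise.context.ApplicationScoped;

    import org.eclipse.microprofile.reactive.messaging.Incoming;
    import org.eclipse.microprofile.reactive.messaging.Outgoing;

    // Illustrative only: not the generated aiSSEMBLE code.
    @ApplicationScoped
    public class ChannelWiringSketch {

        @Incoming("SparkInboundChannel")   // fed by kafka-console-producer.sh above
        @Outgoing("SparkOutboundChannel")  // read back by kafka-console-consumer.sh above
        public String relay(String inbound) {
            // In the generated step this delegates to executeStepImpl(inbound),
            // which logs the payload and returns "Exit message".
            return "Exit message";
        }
    }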

Test Upgrading Project

  • Update your Java version to JDK 11. Run mvn -v to verify it's configured properly
  • Create a downstream project from 1.9.2:
mvn archetype:generate -U -DarchetypeGroupId=com.boozallen.aissemble \
  -DarchetypeArtifactId=foundation-archetype \
  -DarchetypeVersion=1.9.2 \
  -DgroupId=com.test \
  -DartifactId=test-391-upgrade \
  -DprojectGitUrl=test.url \
  -DprojectName=test-391-upgrade \
  && cd test-391-upgrade
  • Add the attached SparkPipelineMessaging.json to the test-391-upgrade-pipeline-models/src/main/resources/pipelines/ directory

  • Run mvn clean install until all the manual actions are complete

  • Run mvn clean install -Dmaven.build.cache.skipCache to get any remaining manual actions

  • Update the parent in the root pom.xml to 1.10.0-SNAPSHOT

  • Update the smallrye-reactive-messaging-kafka dependency within test-391-upgrade-pipelines/spark-pipeline-messaging/pom.xml to look like the following (workaround for the bug in Feature: Upgrade Quarkus to 3.6+ #263):

        <dependency>
            <groupId>io.smallrye.reactive</groupId>
            <artifactId>smallrye-reactive-messaging-kafka</artifactId>
            <version>${version.smallrye.reactive.messaging}</version>
        </dependency>
  • Update your Java version to JDK 17. Run mvn -v to verify it's configured properly
  • Run the following to perform the Baton migrations:
mvn org.technologybrewery.baton:baton-maven-plugin:baton-migrate
  • Verify the test-391-upgrade-pipelines/spark-pipeline-messaging/src/main/java/com/test/cdi/CdiContainerFactory.java now has the following (a consolidated sketch of the result appears after this checklist):
+import com.boozallen.aissemble.messaging.core.cdi.MessagingCdiContext;
+import com.boozallen.aissemble.kafka.context.KafkaConnectorCdiContext;

...

    protected static List<CdiContext> getContexts() {
        List<CdiContext> contexts = new ArrayList<>();
+       contexts.add(new MessagingCdiContext());
+       contexts.add(new KafkaConnectorCdiContext());
        contexts.add(new PipelinesCdiContext());

        return contexts;
    }
  • Verify the test-391-upgrade-pipelines/spark-pipeline-messaging/pom.xml now has the following:
    <dependencies>

...

+        <dependency>
+            <groupId>com.boozallen.aissemble</groupId>
+            <artifactId>extensions-messaging-kafka</artifactId>
+        </dependency>
    </dependencies>
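
As referenced in the CdiContainerFactory verification step above, pulling that diff together, the migrated getContexts() ends up roughly like the sketch below. The two new imports come from the diff; the existing imports (CdiContext, PipelinesCdiContext, the java.util collections) are assumed to already be present in the generated file rather than copied from it.

    import java.util.ArrayList;
    import java.util.List;

    import com.boozallen.aissemble.kafka.context.KafkaConnectorCdiContext;
    import com.boozallen.aissemble.messaging.core.cdi.MessagingCdiContext;
    // ...existing imports for CdiContext, PipelinesCdiContext, etc. remain unchanged

    protected static List<CdiContext> getContexts() {
        List<CdiContext> contexts = new ArrayList<>();
        contexts.add(new MessagingCdiContext());      // added by the Baton migration
        contexts.add(new KafkaConnectorCdiContext()); // added by the Baton migration
        contexts.add(new PipelinesCdiContext());

        return contexts;
    }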

References/Additional Context

@carter-cundiff carter-cundiff added the enhancement New feature or request label Oct 3, 2024
@carter-cundiff carter-cundiff added this to the 1.10.0 milestone Oct 3, 2024
@carter-cundiff carter-cundiff self-assigned this Oct 3, 2024
@carter-cundiff
Contributor Author

DOD with @csun-cpointe

@carter-cundiff
Contributor Author

OTS with @csun-cpointe

@carter-cundiff carter-cundiff changed the title from "Feature: Update pipeline step messaging for Java 17" to "Feature: Update pipeline step messaging to be compatible with Java 17" on Oct 10, 2024
carter-cundiff added a commit that referenced this issue Oct 10, 2024
#391 Update pipeline step messaging to be compatible with Java 17
ewilkins-csi added a commit that referenced this issue Oct 10, 2024
In ad25749 we added the extensions-messaging-kafka dependency to
pipelines by default so the Kafka CDI context could be added in
CdiContainerFactory. The test-mda-model projects do not regenerate their
POM files on rebuild, so we need to add the new dependencies manually.
ewilkins-csi added a commit that referenced this issue Oct 10, 2024
[#391] add kafka dependency to test projects
@nartieri
Collaborator

Both tests passed successfully! 🎉
