
[Feat][Spark] Split datasources and core, prepare for support of multiple spark versions #369

Merged
19 commits merged into apache:main from 320-datasources-refactoring on Feb 22, 2024

Conversation

@SemyonSinchenko (Member)

Proposed changes

  • split datasources and core GraphAr
  • introduce Maven profiles for different versions of Spark (see the profile sketch below this list)
  • small fixes of PySpark part due to new naming and paths
  • new pom.xml files for subprojects
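
As a rough illustration of the profile item above: a parent spark/pom.xml can select a Spark-version-specific datasources module per profile while the core graphar module stays shared. This is only a hedged sketch; the module names, version numbers, and default activation below are assumptions, not the exact contents of the pom.xml files in this PR (which adds datasources-32; a datasources-33 profile would follow the same pattern later).

```xml
<!-- Hypothetical sketch of per-Spark-version profiles in the parent spark/pom.xml.
     Module names, versions, and the default profile are illustrative assumptions. -->
<profiles>
  <profile>
    <id>datasources-32</id>
    <activation>
      <!-- build against Spark 3.2 unless another profile is selected with -P -->
      <activeByDefault>true</activeByDefault>
    </activation>
    <properties>
      <spark.version>3.2.2</spark.version>
      <scala.version>2.12.15</scala.version>
    </properties>
    <modules>
      <module>graphar</module>          <!-- Spark-version-independent core -->
      <module>datasources-32</module>   <!-- Spark 3.2 specific data source code -->
    </modules>
  </profile>
  <profile>
    <id>datasources-33</id>
    <properties>
      <spark.version>3.3.4</spark.version>
      <scala.version>2.12.15</scala.version>
    </properties>
    <modules>
      <module>graphar</module>
      <module>datasources-33</module>   <!-- hypothetical future module for Spark 3.3 -->
    </modules>
  </profile>
</profiles>
```

With such a layout, something like `mvn clean package -P datasources-33` would build the shared core plus the Spark 3.3 data sources once that module exists.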

Checklist

Put an x in the boxes that apply. You can also fill these out after creating the PR. If you're unsure about any of them, don't hesitate to ask. We're here to help! This is simply a reminder of what we are going to look for before merging your code.

  • I have read the CONTRIBUTING doc
  • I have signed the CLA
  • Lint and unit tests pass locally with my changes
  • I have added tests that prove my fix is effective or that my feature works
  • I have added necessary documentation (if appropriate)

Further comments

Part of discussion in #320 and #366

- split datasources and core GraphAr
- introduce Maven profiles for different versions of Spark
- small fixes of PySpark part due to new naming and paths
- new pom.xml files for subprojects

 On branch 320-datasources-refactoring
 Changes to be committed:
	modified:   pyspark/tests/conftest.py
	new file:   spark/datasources-32/pom.xml
	renamed:    spark/src/main/java/com/alibaba/graphar/GeneralParams.java -> spark/datasources-32/src/main/java/com/alibaba/graphar/GeneralParams.java
	renamed:    spark/src/main/scala/com/alibaba/graphar/datasources/GarCommitProtocol.scala -> spark/datasources-32/src/main/scala/com/alibaba/graphar/datasources/GarCommitProtocol.scala
	renamed:    spark/src/main/scala/com/alibaba/graphar/datasources/GarDataSource.scala -> spark/datasources-32/src/main/scala/com/alibaba/graphar/datasources/GarDataSource.scala
	renamed:    spark/src/main/scala/com/alibaba/graphar/datasources/GarScan.scala -> spark/datasources-32/src/main/scala/com/alibaba/graphar/datasources/GarScan.scala
	renamed:    spark/src/main/scala/com/alibaba/graphar/datasources/GarScanBuilder.scala -> spark/datasources-32/src/main/scala/com/alibaba/graphar/datasources/GarScanBuilder.scala
	renamed:    spark/src/main/scala/com/alibaba/graphar/datasources/GarTable.scala -> spark/datasources-32/src/main/scala/com/alibaba/graphar/datasources/GarTable.scala
	renamed:    spark/src/main/scala/com/alibaba/graphar/datasources/GarWriterBuilder.scala -> spark/datasources-32/src/main/scala/com/alibaba/graphar/datasources/GarWriterBuilder.scala
	renamed:    spark/src/main/scala/com/alibaba/graphar/datasources/csv/CSVWriterBuilder.scala -> spark/datasources-32/src/main/scala/com/alibaba/graphar/datasources/csv/CSVWriterBuilder.scala
	renamed:    spark/src/main/scala/com/alibaba/graphar/datasources/orc/OrcOutputWriter.scala -> spark/datasources-32/src/main/scala/com/alibaba/graphar/datasources/orc/OrcOutputWriter.scala
	renamed:    spark/src/main/scala/com/alibaba/graphar/datasources/orc/OrcWriteBuilder.scala -> spark/datasources-32/src/main/scala/com/alibaba/graphar/datasources/orc/OrcWriteBuilder.scala
	renamed:    spark/src/main/scala/com/alibaba/graphar/datasources/parquet/ParquetWriterBuilder.scala -> spark/datasources-32/src/main/scala/com/alibaba/graphar/datasources/parquet/ParquetWriterBuilder.scala
	new file:   spark/graphar/pom.xml
	new file:   spark/graphar/src/main/java/com/alibaba/graphar/GeneralParams.java
	renamed:    spark/src/main/scala/com/alibaba/graphar/EdgeInfo.scala -> spark/graphar/src/main/scala/com/alibaba/graphar/EdgeInfo.scala
	renamed:    spark/src/main/scala/com/alibaba/graphar/GraphInfo.scala -> spark/graphar/src/main/scala/com/alibaba/graphar/GraphInfo.scala
	renamed:    spark/src/main/scala/com/alibaba/graphar/VertexInfo.scala -> spark/graphar/src/main/scala/com/alibaba/graphar/VertexInfo.scala
	renamed:    spark/src/main/scala/com/alibaba/graphar/example/GraphAr2Nebula.scala -> spark/graphar/src/main/scala/com/alibaba/graphar/example/GraphAr2Nebula.scala
	renamed:    spark/src/main/scala/com/alibaba/graphar/example/GraphAr2Neo4j.scala -> spark/graphar/src/main/scala/com/alibaba/graphar/example/GraphAr2Neo4j.scala
	renamed:    spark/src/main/scala/com/alibaba/graphar/example/Nebula2GraphAr.scala -> spark/graphar/src/main/scala/com/alibaba/graphar/example/Nebula2GraphAr.scala
	renamed:    spark/src/main/scala/com/alibaba/graphar/example/Neo4j2GraphAr.scala -> spark/graphar/src/main/scala/com/alibaba/graphar/example/Neo4j2GraphAr.scala
	renamed:    spark/src/main/scala/com/alibaba/graphar/graph/GraphReader.scala -> spark/graphar/src/main/scala/com/alibaba/graphar/graph/GraphReader.scala
	renamed:    spark/src/main/scala/com/alibaba/graphar/graph/GraphTransformer.scala -> spark/graphar/src/main/scala/com/alibaba/graphar/graph/GraphTransformer.scala
	renamed:    spark/src/main/scala/com/alibaba/graphar/graph/GraphWriter.scala -> spark/graphar/src/main/scala/com/alibaba/graphar/graph/GraphWriter.scala
	renamed:    spark/src/main/scala/com/alibaba/graphar/importer/Neo4j.scala -> spark/graphar/src/main/scala/com/alibaba/graphar/importer/Neo4j.scala
	renamed:    spark/src/main/scala/com/alibaba/graphar/reader/EdgeReader.scala -> spark/graphar/src/main/scala/com/alibaba/graphar/reader/EdgeReader.scala
	renamed:    spark/src/main/scala/com/alibaba/graphar/reader/VertexReader.scala -> spark/graphar/src/main/scala/com/alibaba/graphar/reader/VertexReader.scala
	renamed:    spark/src/main/scala/com/alibaba/graphar/util/DataFrameConcat.scala -> spark/graphar/src/main/scala/com/alibaba/graphar/util/DataFrameConcat.scala
	renamed:    spark/src/main/scala/com/alibaba/graphar/util/FileSystem.scala -> spark/graphar/src/main/scala/com/alibaba/graphar/util/FileSystem.scala
	renamed:    spark/src/main/scala/com/alibaba/graphar/util/IndexGenerator.scala -> spark/graphar/src/main/scala/com/alibaba/graphar/util/IndexGenerator.scala
	renamed:    spark/src/main/scala/com/alibaba/graphar/util/Patitioner.scala -> spark/graphar/src/main/scala/com/alibaba/graphar/util/Patitioner.scala
	renamed:    spark/src/main/scala/com/alibaba/graphar/util/Utils.scala -> spark/graphar/src/main/scala/com/alibaba/graphar/util/Utils.scala
	renamed:    spark/src/main/scala/com/alibaba/graphar/writer/EdgeWriter.scala -> spark/graphar/src/main/scala/com/alibaba/graphar/writer/EdgeWriter.scala
	renamed:    spark/src/main/scala/com/alibaba/graphar/writer/VertexWriter.scala -> spark/graphar/src/main/scala/com/alibaba/graphar/writer/VertexWriter.scala
	new file:   spark/graphar/src/test/resources/gar-test
	renamed:    spark/src/test/scala/com/alibaba/graphar/ComputeExample.scala -> spark/graphar/src/test/scala/com/alibaba/graphar/ComputeExample.scala
	renamed:    spark/src/test/scala/com/alibaba/graphar/TestGraphInfo.scala -> spark/graphar/src/test/scala/com/alibaba/graphar/TestGraphInfo.scala
	renamed:    spark/src/test/scala/com/alibaba/graphar/TestGraphReader.scala -> spark/graphar/src/test/scala/com/alibaba/graphar/TestGraphReader.scala
	renamed:    spark/src/test/scala/com/alibaba/graphar/TestGraphTransformer.scala -> spark/graphar/src/test/scala/com/alibaba/graphar/TestGraphTransformer.scala
	renamed:    spark/src/test/scala/com/alibaba/graphar/TestGraphWriter.scala -> spark/graphar/src/test/scala/com/alibaba/graphar/TestGraphWriter.scala
	renamed:    spark/src/test/scala/com/alibaba/graphar/TestIndexGenerator.scala -> spark/graphar/src/test/scala/com/alibaba/graphar/TestIndexGenerator.scala
	renamed:    spark/src/test/scala/com/alibaba/graphar/TestReader.scala -> spark/graphar/src/test/scala/com/alibaba/graphar/TestReader.scala
	renamed:    spark/src/test/scala/com/alibaba/graphar/TestWriter.scala -> spark/graphar/src/test/scala/com/alibaba/graphar/TestWriter.scala
	renamed:    spark/src/test/scala/com/alibaba/graphar/TransformExample.scala -> spark/graphar/src/test/scala/com/alibaba/graphar/TransformExample.scala
	modified:   spark/pom.xml
	deleted:    spark/src/test/resources/gar-test
 On branch 320-datasources-refactoring
 Changes to be committed:
	new file:   spark/datasources-32/.scalafmt.conf
	modified:   spark/datasources-32/src/main/scala/com/alibaba/graphar/datasources/GarDataSource.scala
	new file:   spark/graphar/.scalafmt.conf
	modified:   spark/graphar/pom.xml
	modified:   spark/pom.xml
 On branch 320-datasources-refactoring
 Changes to be committed:
	typechange: spark/datasources-32/.scalafmt.conf
	typechange: spark/graphar/.scalafmt.conf
 On branch 320-datasources-refactoring
 Changes to be committed:
	modified:   .licenserc.yaml
 On branch 320-datasources-refactoring
 Changes to be committed:
	modified:   spark/graphar/src/test/resources/gar-test
 On branch 320-datasources-refactoring
 Changes to be committed:
	typechange: spark/datasources-32/.scalafmt.conf
	typechange: spark/graphar/.scalafmt.conf
@SemyonSinchenko self-assigned this on Feb 21, 2024
 On branch 320-datasources-refactoring
 Changes to be committed:
	modified:   .licenserc.yaml
	modified:   spark/scripts/run-graphar2nebula.sh
	modified:   spark/scripts/run-graphar2neo4j.sh
	modified:   spark/scripts/run-nebula2graphar.sh
	modified:   spark/scripts/run-neo4j2graphar.sh
 On branch 320-datasources-refactoring
 Changes to be committed:
	modified:   .licenserc.yaml
 On branch 320-datasources-refactoring
 Changes to be committed:
	modified:   spark/import/neo4j.sh
@SemyonSinchenko (Member, Author)

Could someone help me with .licenserc.yaml? What should I put there to force it to ignore any pattern like spark/datasources-32/src/main/scala/com/alibaba/graphar/datasources, spark/datasources-33/..., or any other version of the datasources that will appear once we extend the list of supported Spark versions?

I tried 'spark/datasources-*/src/main/scala/com/alibaba/graphar/datasources', but it is not working, as one may see.

@acezen (Contributor) commented Feb 22, 2024

It seems that skywalking-eyes does not support path patterns like datasources-*; you can just use the full paths, like:

  • spark/datasources-32/src/main/scala/com/alibaba/graphar/datasources
  • spark/datasources-33/src/main/scala/com/alibaba/graphar/datasources
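
For reference, the resulting section could look roughly like the following. This assumes the standard skywalking-eyes header/paths-ignore layout and shows only the entries discussed here, so the actual .licenserc.yaml in the repository may differ:

```yaml
# Hypothetical excerpt of .licenserc.yaml; only the paths-ignore entries
# discussed in this thread are shown.
header:
  paths-ignore:
    - 'spark/datasources-32/src/main/scala/com/alibaba/graphar/datasources'
    - 'spark/datasources-33/src/main/scala/com/alibaba/graphar/datasources'
```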

@acezen (Contributor) left a review comment:

Thanks to Sem, the changes LGTM. I left some comments; please make the CI pass before merging the PR.

 On branch 320-datasources-refactoring
 Changes to be committed:
	modified:   .licenserc.yaml
	typechange: spark/datasources-32/src/main/java/com/alibaba/graphar/GeneralParams.java
 On branch 320-datasources-refactoring
 Changes to be committed:
	modified:   spark/pom.xml
@acezen (Contributor) commented Feb 22, 2024

- added direct export of JAVA_HOME=JAVA_11 because it works for tests

 On branch 320-datasources-refactoring
 Changes to be committed:
	modified:   docs/Makefile
@SemyonSinchenko (Member, Author)

The docs generate command here (https://github.com/alibaba/GraphAr/blob/4158bb8167694b0bca5d799c9ce3f52b71b61141/docs/Makefile#L40) may need to be updated.

To be honest, I'm surprised that it is not working. The same command works fine in my local setup, even with mvn clean scala:doc.

--no-transfer-progress should make the GHA logs a little more readable.

 On branch 320-datasources-refactoring
 Changes to be committed:
	modified:   docs/Makefile
 On branch 320-datasources-refactoring
 Changes to be committed:
	modified:   docs/Makefile
docs/Makefile (review thread, marked outdated and resolved)
- add --no-transfer-progress to every mvn command in CI
- move JAVA_HOME export from Makefile level to CI level

 On branch 320-datasources-refactoring
 Changes to be committed:
	modified:   .github/workflows/docs.yml
	modified:   .github/workflows/spark.yaml
	modified:   docs/Makefile
	modified:   pyspark/Makefile
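
As a hedged sketch of what the commit above (moving the JAVA_HOME export from the Makefile to the CI level and adding --no-transfer-progress) could look like in .github/workflows/spark.yaml; the job and step names, JDK path, and checkout action version are assumptions, not the actual workflow contents:

```yaml
# Hypothetical excerpt of .github/workflows/spark.yaml; names and paths
# here are illustrative assumptions.
jobs:
  spark-test:
    runs-on: ubuntu-latest
    env:
      # JAVA_HOME exported once at the CI level instead of inside docs/Makefile
      JAVA_HOME: /usr/lib/jvm/java-11-openjdk-amd64
    steps:
      - uses: actions/checkout@v4
      - name: Build and test the Spark modules
        run: |
          cd spark
          mvn --no-transfer-progress clean test
```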
@SemyonSinchenko (Member, Author)

The last idea in my mind is to add mvn package before the mvn scala:doc...

 On branch 320-datasources-refactoring
 Changes to be committed:
	modified:   docs/Makefile
@SemyonSinchenko (Member, Author)

I'm out of ideas. @acezen, could you try to run mvn scala:doc from the spark folder locally? Does it work for you? It is crazy that the Reactor build works fine for test/compile/package, but the same setup fails on scala:doc.

@acezen (Contributor) commented Feb 22, 2024

I have tried it and got the same problem as the CI. I am trying to exclude the dependency when generating the doc; I hope it works.

@acezen (Contributor) commented Feb 22, 2024

Try mvn clean install -DskipTests before mvn scala:doc; this solution works in my environment.
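
Put together, the Scaladoc sequence this suggestion points to would look roughly like the sketch below; the exact target in docs/Makefile may word it differently:

```sh
# Hedged sketch of the suggested docs build order; the real docs/Makefile
# target may differ in detail.
cd spark
mvn --no-transfer-progress clean install -DskipTests   # install all modules to the local repo first
mvn --no-transfer-progress scala:doc                    # scala:doc can then resolve the sibling modules
```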

 On branch 320-datasources-refactoring
 Changes to be committed:
	modified:   docs/Makefile
 On branch 320-datasources-refactoring
 Changes to be committed:
	modified:   docs/Makefile
@SemyonSinchenko (Member, Author)

Finally, everything is green! 🎉

@acezen (Contributor) left a review comment:

LGTM

@acezen merged commit bdcf367 into apache:main on Feb 22, 2024
4 checks passed
@SemyonSinchenko deleted the 320-datasources-refactoring branch on April 22, 2024 at 17:48