Closed
104 commits
22c0ffa
[HUDI-5373] Different fileids are assigned to the same bucket (#7433)
loukey-lj Dec 13, 2022
5744adb
[HUDI-4113] Fix cannot parse <null> schema when use spark delete sql …
KnightChess Dec 13, 2022
e4aea74
[HUDI-5318] Fix partition pruning for clustering scheduling (#7366)
stream2000 Dec 14, 2022
d635eed
[HUDI-5412] Send the boostrap event if the JM also rebooted (#7497)
danny0405 Dec 18, 2022
8a64065
[HUDI-5414] No need to guard the table initialization by lock for Hoo…
danny0405 Dec 20, 2022
ab4d5a5
Fixing compilation errors
nsivabalan Mar 22, 2023
8427a2d
Revert "[HUDI-5409] Avoid file index and use fs view cache in COW inp…
codope Dec 24, 2022
2264b58
[HUDI-5400] Fix read issues when Hudi-FULL schema evolution is not en…
voonhous Dec 24, 2022
1e645cf
[HUDI-5411] Avoid virtual key info for COW table in the input format …
codope Dec 24, 2022
a086f01
[HUDI-4827] Upgrade Azure CI to Ubuntu 22.04 and scalatest-maven-plug…
lokeshj1703 Dec 27, 2022
875fd50
Remove minlog.Log (#7441)
cxzl25 Dec 28, 2022
bd6544d
[HUDI-5420] Fix metadata table validator to exclude uncommitted log f…
yihua Dec 29, 2022
c2df252
[MINOR] Fix doap file syntax and date (#7586)
xushiyan Jan 2, 2023
1bb44a0
[HUDI-5489] Flink offline compactor throws exception in service mode …
danny0405 Jan 3, 2023
cec8cd1
[HUDI-5477] Optimize timeline loading in Hudi sync client (#7561)
yihua Jan 5, 2023
7332549
[HUDI-5160] Fix data source write save as table (#7448)
xushiyan Jan 5, 2023
fe1ebaf
[HUDI-5341] CleanPlanner retains earliest commits must not be later t…
SteNicholas Jan 6, 2023
8ba5aee
[HUDI-5506] StreamWriteOperatorCoordinator may not recommit with part…
SteNicholas Jan 6, 2023
c6a3677
[HUDI-5192] add non-code file extensions to ignore list (#7597)
jonvex Jan 6, 2023
87371a0
[HUDI-5231] suppress checkstyle warnings (#7473)
jonvex Jan 6, 2023
ef661db
[MINOR] Set engine when creating meta write config (#7575)
xccui Jan 7, 2023
eb71c98
[HUDI-5511] Do not clean the CkpMetadata dir when restart the job (#7…
danny0405 Jan 9, 2023
4d3df06
[HUDI-5504] Fix concurrency conflict for flink async compaction with …
ThinkerLei Jan 9, 2023
8c2082e
[HUDI-5484] Avoid using `GenericRecord` in `HoodieColumnStatMetadata`…
cxzl25 Jan 9, 2023
5dd4a8b
[HUDI-5515] Fix concurrency conflict in ClusteringOperator with laten…
ThinkerLei Jan 10, 2023
46b0c71
[HUDI-5434] Fix archival in metadata table to not rely on completed r…
yihua Jan 11, 2023
392420b
[HUDI-5381] Fix for class cast exception when running with Flink 1.15…
kkrugler Jan 12, 2023
fbf4884
[MINOR] Fix flaky tests in ITTestHoodieDataSource caused by unordered…
trushev Jan 12, 2023
d466f08
[MINOR] Add metastore_db/ into gitignore file (#7648)
jackwener Jan 12, 2023
89dc95f
[HUDI-5543] Description of clustering.plan.partition.filter.mode supp…
SteNicholas Jan 13, 2023
b3c50b9
[HUDI-5538] Fix ContinuousFileSource and ITTestDataStreamWrite for fl…
trushev Jan 13, 2023
34d5271
[HUDI-5538] Fix ContinuousFileSource and ITTestDataStreamWrite for fl…
nsivabalan Mar 22, 2023
f7e56b6
Fixing build failure due to unused imports
nsivabalan Mar 22, 2023
bc48ba2
[MINOR] Fix minor issues in HoodieMetadataTableValidator docs (#7518)
yihua Jan 13, 2023
11f9f08
[HUDI-5275] Fix reading data using the HoodieHiveCatalog will cause t…
danny0405 Jan 16, 2023
8b3c18f
[MINOR] Add database config for flink (#7682)
yuzhaojing Jan 17, 2023
56e3cc6
[HUDI-4710] Fix flaky: TestKeyRangeLookupTree#testFileGroupLookUpMany…
boneanxs Jan 17, 2023
5ecd01c
[HUDI-5433] Fix the way we deduce the pending instants for MDT writes…
codope Jan 18, 2023
d7eb124
[HUDI-5336] Fixing log file pattern match to ignore extraneous files …
nsivabalan Jan 20, 2023
9cad08b
Fixing build failure
nsivabalan Mar 22, 2023
99522dd
[HUDI-5516] Reduce memory footprint on workload with thousand active …
trushev Jan 20, 2023
4335361
[HUDI-5589] Fix Hudi config inference (#7713)
yihua Jan 20, 2023
78c0c14
[HUDI-5499] Fixing Spark SQL configs not being properly propagated fo…
alexeykudinkin Jan 20, 2023
909d7fc
[MINOR] Disable async clean in testCleanerDeleteReplacedDataWithArchi…
xushiyan Jan 20, 2023
e18ce3f
[HUDI-5407][HUDI-5408] Fixing rollback in MDT to be eager (#7490)
nsivabalan Jan 21, 2023
87b5802
[minor] Fix flink 1.15 build profile (#7731)
danny0405 Jan 23, 2023
48b33b7
Fixing FS `InputStream` leaks (#7741)
alexeykudinkin Jan 24, 2023
4bd62d4
[MINOR] Fixing `TestStructuredStreaming` test (#7736)
alexeykudinkin Jan 24, 2023
fc8219c
[HUDI-5593] Fixing deadlocks due to async cleaner awaiting for lock w…
nsivabalan Jan 24, 2023
dcbc038
[HUDI-5401] Ensure user-provided hive metastore uri is set in HiveCon…
codope Jan 24, 2023
c675c56
[HUDI-5380] Fixing change table path but table location in metastore …
BruceKellan Jan 26, 2023
8790f7f
[HUDI-5485] Add File System View API for batch listing and improve sa…
yihua Jan 26, 2023
a1f5380
[HUDI-5592] Fixing some of the flaky tests in CI (#7720)
nsivabalan Jan 27, 2023
0de2e66
[HUDI-5630] Fixing flaky parquet projection tests (#7768)
nsivabalan Jan 28, 2023
a67e5cf
[HUDI-5635] Fix release scripts (#7775)
yihua Jan 28, 2023
4b70756
[MINOR] Fix validate_staged_release.sh (#7780)
yihua Jan 28, 2023
795edb6
[HUDI-5640] Add missing profiles in `deploy_staging_jars.sh` (#7784)
yihua Jan 29, 2023
e75f830
[HUDI-5563] Check table exist before drop table (#7679)
Zouxxyy Jan 31, 2023
c3ecd72
[HUDI-5568] Fix the BucketStreamWriteFunction to rebase the local fil…
loukey-lj Jan 31, 2023
f2e7738
[HUDI-5655] Closing write client for spark ds writer in all cases (in…
nsivabalan Jan 31, 2023
2fabc14
[HUDI-5654] Fixing read of an empty rollback completed meta files fro…
nsivabalan Jan 31, 2023
d2a3689
[HUDI-5553] Prevent partition(s) from being dropped if there are pend…
voonhous Jan 31, 2023
c986ba4
[HUDI-5585][flink] Fix flink creates and writes the table, the spark …
waywtdcc Feb 1, 2023
06cce73
[HUDI-5681] Fixing Kryo being instantiated w/ invalid `SparkConf` (#7…
alexeykudinkin Feb 2, 2023
623a0b7
Fixing test failures and build issues
nsivabalan Mar 22, 2023
de29ac8
[HUDI-5676] Fix BigQuerySyncTool standalone mode (#7816)
xushiyan Feb 2, 2023
203c1b7
[HUDI-5671] BucketIndexPartitioner partition algorithm skew (#7815)
loukey-lj Feb 3, 2023
6d5482b
[HUDI-5329] Spark reads hudi table error when flink creates the table…
wuwenchi Feb 7, 2023
2ee82e0
[HUDI-5734] Fix flink batch read skip clustering data lost (#7903)
hbgstc123 Feb 10, 2023
83b2faa
[HUDI-5768] Fix Spark Datasource read of metadata table (#7924)
yihua Feb 12, 2023
5871d34
Fixing build failures
nsivabalan Mar 22, 2023
d293bab
[HUDI-4406] Support Flink compaction/clustering write error resolveme…
chenshzh Feb 14, 2023
d514aaf
[HUDI-5787] HMSDDLExecutor should set table type to EXTERNAL_TABLE wh…
SteNicholas Feb 15, 2023
9c76f73
[HUDI-5058] HoodieCatalog#getTable sets primary key with hoodie.datas…
SteNicholas Feb 20, 2023
d4b01f9
[HUDI-5729] Fix RowDataKeyGen method getRecordKey (#7894)
sandyfog Feb 20, 2023
905e087
Handle empty payloads for AbstractDebeziumAvroPayload (#7944)
jonvex Feb 21, 2023
eaef8fb
[HUDI-5850] Fix timestamp(6) field long overflow (#8052)
sandyfog Feb 27, 2023
e20928e
[HUDI-5855] Release resource actively for Flink hive meta sync (#8050)
XuQianJin-Stars Feb 27, 2023
068ac07
fixing build failures
nsivabalan Mar 22, 2023
0c061f4
[HUDI-5728] HoodieTimelineArchiver archives the latest instant before…
SteNicholas Mar 3, 2023
71f2dfe
[HUDI-5863] Fix HoodieMetadataFileSystemView serving stale view at th…
yihua Mar 4, 2023
01cf0b3
Fixing build failure
nsivabalan Mar 22, 2023
168892b
[HUDI-5333] Ignore file system type of basePath when using RocksDbBas…
1032851561 Mar 10, 2023
ef96ed8
[HUDI-5857] Insert overwrite into bucket table would generate new fil…
beyond1920 Mar 10, 2023
e0511b6
fixing build issues
nsivabalan Mar 22, 2023
1cb8e83
[HUDI-5917] Fix HoodieRetryWrapperFileSystem getDefaultReplication (#…
sandyfog Mar 11, 2023
3835aea
[HUDI-5785] Enhance Spark Datasource tests (#7938)
yihua Mar 18, 2023
1ddb559
fixing build failures
nsivabalan Mar 22, 2023
5157dbf
[HUDI-5950] Fixing pending instant deduction to trigger compaction in…
nsivabalan Mar 21, 2023
c60981d
[HUDI-5822] Fix bucket stream writer fileId not found exception (#8263)
voonhous Mar 22, 2023
481ebe9
[HUDI-5962] Adding timeline server support to integ test suite (#8248)
nsivabalan Mar 22, 2023
3d1febd
[MINOR] Fix flaky testStructuredStreamingWithCompaction
nsivabalan Mar 23, 2023
da85567
fix test failures
danny0405 Mar 23, 2023
d7f6b2a
[HUDI-5399] Flink mor table streaming read throws NPE (#7504)
danny0405 Dec 19, 2022
6ab5f54
[HUDI-5414] No need to guard the table initialization by lock for Hoo…
danny0405 Dec 20, 2022
b13d50c
[HUDI-3673] Clean up hbase shading dependencies (#7371)
xushiyan Jan 24, 2023
e7dbc84
[HUDI-5835] After performing the update operation, the hoodie table c…
xiarixiaoyao Feb 28, 2023
e68bfec
Fixing test failures
nsivabalan Mar 23, 2023
75bd3ab
Fixing test failures
nsivabalan Mar 24, 2023
8101540
Triaging a flaky test
nsivabalan Mar 24, 2023
a5482d7
fixing base path in tests
nsivabalan Mar 24, 2023
3949998
Fix schema handling for record type in write path
codope Mar 25, 2023
052c470
Restore test decimal type
codope Mar 25, 2023
7201aec
Fix avro schema evolution tests
codope Mar 25, 2023
9 changes: 9 additions & 0 deletions .github/workflows/bot.yml
@@ -6,6 +6,15 @@ on:
- master
- 'release-*'
pull_request:
paths-ignore:
- '**.bmp'
- '**.gif'
- '**.jpg'
- '**.jpeg'
- '**.md'
- '**.pdf'
- '**.png'
- '**.svg'
branches:
- master
- 'release-*'
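The new `paths-ignore` block makes pull requests that only touch documentation and image files skip the CI workflow. A rough shell approximation of that extension matching, with the same eight patterns from the diff (the matching function and sample file names are my own illustration, not GitHub's implementation):

```shell
#!/usr/bin/env bash
# Sketch of paths-ignore matching: a change set is skipped only if every
# changed file matches one of the ignored extensions from bot.yml.
should_skip() {
  case "$1" in
    *.bmp|*.gif|*.jpg|*.jpeg|*.md|*.pdf|*.png|*.svg) return 0 ;;  # doc/image file
    *) return 1 ;;                                                # code file: run CI
  esac
}

for f in README.md docs/images/arch.png hudi-common/src/Main.java; do
  if should_skip "$f"; then
    echo "ignored for CI: $f"
  else
    echo "triggers CI: $f"
  fi
done
```

Note that GitHub's `**.md` patterns match at any directory depth, which the `*.md` case patterns above mimic only approximately.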
1 change: 1 addition & 0 deletions .gitignore
@@ -1,6 +1,7 @@
# Directories #
/build/
target/
metastore_db/
.mvn/

# OS Files #
157 changes: 3 additions & 154 deletions azure-pipelines.yml
@@ -19,7 +19,7 @@ trigger:
- '*' # must quote since "*" is a YAML reserved character; we want a string

pool:
vmImage: 'ubuntu-18.04'
vmImage: 'ubuntu-22.04'

parameters:
- name: job1Modules
@@ -46,23 +46,7 @@ parameters:
- name: job4Modules
type: object
default:
- '!hudi-client/hudi-spark-client'
- '!hudi-common'
- '!hudi-examples'
- '!hudi-examples/hudi-examples-common'
- '!hudi-examples/hudi-examples-flink'
- '!hudi-examples/hudi-examples-java'
- '!hudi-examples/hudi-examples-spark'
- '!hudi-flink-datasource'
- '!hudi-flink-datasource/hudi-flink'
- '!hudi-flink-datasource/hudi-flink1.13.x'
- '!hudi-flink-datasource/hudi-flink1.14.x'
- '!hudi-flink-datasource/hudi-flink1.15.x'
- '!hudi-spark-datasource'
- '!hudi-spark-datasource/hudi-spark'
- '!hudi-spark-datasource/hudi-spark2'
- '!hudi-spark-datasource/hudi-spark2-common'
- '!hudi-spark-datasource/hudi-spark-common'
- 'hudi-utilities'

variables:
BUILD_PROFILES: '-Dscala-2.11 -Dspark2.4 -Dflink1.14'
@@ -80,96 +64,6 @@ variables:
stages:
- stage: test
jobs:
- job: UT_FT_1
displayName: UT FT common & flink & UT client/spark-client
timeoutInMinutes: '150'
steps:
- task: Maven@4
displayName: maven install
inputs:
mavenPomFile: 'pom.xml'
goals: 'clean install'
options: $(MVN_OPTS_INSTALL)
publishJUnitResults: false
jdkVersionOption: '1.8'
- task: Maven@4
displayName: UT common flink client/spark-client
inputs:
mavenPomFile: 'pom.xml'
goals: 'test'
options: $(MVN_OPTS_TEST) -Punit-tests -pl $(JOB1_MODULES),hudi-client/hudi-spark-client
publishJUnitResults: false
jdkVersionOption: '1.8'
mavenOptions: '-Xmx4g'
- task: Maven@4
displayName: FT common flink
inputs:
mavenPomFile: 'pom.xml'
goals: 'test'
options: $(MVN_OPTS_TEST) -Pfunctional-tests -pl $(JOB1_MODULES)
publishJUnitResults: false
jdkVersionOption: '1.8'
mavenOptions: '-Xmx4g'
- script: |
grep "testcase" */target/surefire-reports/*.xml */*/target/surefire-reports/*.xml | awk -F'"' ' { print $6,$4,$2 } ' | sort -nr | head -n 100
displayName: Top 100 long-running testcases
- job: UT_FT_2
displayName: FT client/spark-client
timeoutInMinutes: '150'
steps:
- task: Maven@4
displayName: maven install
inputs:
mavenPomFile: 'pom.xml'
goals: 'clean install'
options: $(MVN_OPTS_INSTALL)
publishJUnitResults: false
jdkVersionOption: '1.8'
- task: Maven@4
displayName: FT client/spark-client
inputs:
mavenPomFile: 'pom.xml'
goals: 'test'
options: $(MVN_OPTS_TEST) -Pfunctional-tests -pl $(JOB2_MODULES)
publishJUnitResults: false
jdkVersionOption: '1.8'
mavenOptions: '-Xmx4g'
- script: |
grep "testcase" */target/surefire-reports/*.xml */*/target/surefire-reports/*.xml | awk -F'"' ' { print $6,$4,$2 } ' | sort -nr | head -n 100
displayName: Top 100 long-running testcases
- job: UT_FT_3
displayName: UT FT spark-datasource
timeoutInMinutes: '150'
steps:
- task: Maven@4
displayName: maven install
inputs:
mavenPomFile: 'pom.xml'
goals: 'clean install'
options: $(MVN_OPTS_INSTALL)
publishJUnitResults: false
jdkVersionOption: '1.8'
- task: Maven@4
displayName: UT spark-datasource
inputs:
mavenPomFile: 'pom.xml'
goals: 'test'
options: $(MVN_OPTS_TEST) -Punit-tests -pl $(JOB3_MODULES)
publishJUnitResults: false
jdkVersionOption: '1.8'
mavenOptions: '-Xmx4g'
- task: Maven@4
displayName: FT spark-datasource
inputs:
mavenPomFile: 'pom.xml'
goals: 'test'
options: $(MVN_OPTS_TEST) -Pfunctional-tests -pl $(JOB3_MODULES)
publishJUnitResults: false
jdkVersionOption: '1.8'
mavenOptions: '-Xmx4g'
- script: |
grep "testcase" */target/surefire-reports/*.xml */*/target/surefire-reports/*.xml | awk -F'"' ' { print $6,$4,$2 } ' | sort -nr | head -n 100
displayName: Top 100 long-running testcases
- job: UT_FT_4
displayName: UT FT other modules
timeoutInMinutes: '150'
@@ -191,51 +85,6 @@ stages:
publishJUnitResults: false
jdkVersionOption: '1.8'
mavenOptions: '-Xmx4g'
- task: Maven@4
displayName: FT other modules
inputs:
mavenPomFile: 'pom.xml'
goals: 'test'
options: $(MVN_OPTS_TEST) -Pfunctional-tests -pl $(JOB4_MODULES)
publishJUnitResults: false
jdkVersionOption: '1.8'
mavenOptions: '-Xmx4g'
- script: |
grep "testcase" */target/surefire-reports/*.xml */*/target/surefire-reports/*.xml | awk -F'"' ' { print $6,$4,$2 } ' | sort -nr | head -n 100
displayName: Top 100 long-running testcases
- job: IT
displayName: IT modules
timeoutInMinutes: '150'
steps:
- task: Maven@4
displayName: maven install
inputs:
mavenPomFile: 'pom.xml'
goals: 'clean install'
options: $(MVN_OPTS_INSTALL) -Pintegration-tests
publishJUnitResults: false
jdkVersionOption: '1.8'
- task: Maven@4
displayName: UT integ-test
inputs:
mavenPomFile: 'pom.xml'
goals: 'test'
options: $(MVN_OPTS_TEST) -Pintegration-tests -DskipUTs=false -DskipITs=true -pl hudi-integ-test
publishJUnitResults: false
jdkVersionOption: '1.8'
mavenOptions: '-Xmx4g'
- task: AzureCLI@2
displayName: Prepare for IT
inputs:
azureSubscription: apachehudici-service-connection
scriptType: bash
scriptLocation: inlineScript
inlineScript: |
echo 'Downloading $(SPARK_ARCHIVE)'
az storage blob download -c ci-caches -n $(SPARK_ARCHIVE).tgz -f $(Pipeline.Workspace)/$(SPARK_ARCHIVE).tgz --account-name apachehudici
tar -xvf $(Pipeline.Workspace)/$(SPARK_ARCHIVE).tgz -C $(Pipeline.Workspace)/
mkdir /tmp/spark-events/
- script: |
export SPARK_HOME=$(Pipeline.Workspace)/$(SPARK_ARCHIVE)
mvn $(MVN_OPTS_TEST) -Pintegration-tests verify
displayName: IT
displayName: Top 100 long-running testcases
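The removed jobs each ended with the same `grep … surefire-reports` one-liner for the "Top 100 long-running testcases" step. A runnable sketch of what it does (the surefire XML content and module names below are made up for illustration; the real pipeline also greps a second, deeper `*/*/target/...` glob):

```shell
#!/usr/bin/env bash
# Minimal reproduction of the "Top 100 long-running testcases" step.
cd "$(mktemp -d)"
mkdir -p mod-a/target/surefire-reports mod-b/target/surefire-reports
cat > mod-a/target/surefire-reports/TEST-Fast.xml <<'EOF'
<testcase name="testFast" classname="org.example.FastTest" time="0.42"/>
EOF
cat > mod-b/target/surefire-reports/TEST-Slow.xml <<'EOF'
<testcase name="testSlow" classname="org.example.SlowTest" time="12.5"/>
EOF

# awk splits each matched line on double quotes, so $2=name, $4=classname,
# $6=time; printing "$6 $4 $2" and piping through `sort -nr` orders the
# tests by runtime, longest first.
slowest=$(grep "testcase" */target/surefire-reports/*.xml \
  | awk -F'"' '{ print $6, $4, $2 }' | sort -nr | head -n 1)
echo "$slowest"
```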
10 changes: 10 additions & 0 deletions doap_HUDI.rdf
@@ -101,6 +101,16 @@
<created>2022-08-16</created>
<revision>0.12.0</revision>
</Version>
<Version>
<name>Apache Hudi 0.12.1</name>
<created>2022-10-18</created>
<revision>0.12.1</revision>
</Version>
<Version>
<name>Apache Hudi 0.12.2</name>
<created>2022-12-28</created>
<revision>0.12.2</revision>
</Version>
</release>
<repository>
<GitRepository>
2 changes: 1 addition & 1 deletion docker/demo/config/test-suite/compact-test.properties
@@ -22,7 +22,7 @@ hoodie.bulkinsert.shuffle.parallelism=100
hoodie.deltastreamer.source.test.num_partitions=100
hoodie.deltastreamer.source.test.datagen.use_rocksdb_for_storing_existing_keys=false
hoodie.deltastreamer.source.test.max_unique_records=100000000
hoodie.embed.timeline.server=false

hoodie.deltastreamer.source.input.selector=org.apache.hudi.integ.testsuite.helpers.DFSTestSuitePathSelector

hoodie.deltastreamer.source.input.selector=org.apache.hudi.integ.testsuite.helpers.DFSTestSuitePathSelector
2 changes: 1 addition & 1 deletion docker/demo/config/test-suite/multi-writer-1.properties
@@ -25,7 +25,7 @@ hoodie.metadata.enable=false
hoodie.deltastreamer.source.test.num_partitions=100
hoodie.deltastreamer.source.test.datagen.use_rocksdb_for_storing_existing_keys=false
hoodie.deltastreamer.source.test.max_unique_records=100000000
hoodie.embed.timeline.server=false

hoodie.deltastreamer.source.input.selector=org.apache.hudi.integ.testsuite.helpers.DFSTestSuitePathSelector

hoodie.deltastreamer.source.input.selector=org.apache.hudi.integ.testsuite.helpers.DFSTestSuitePathSelector
2 changes: 1 addition & 1 deletion docker/demo/config/test-suite/multi-writer-2.properties
@@ -25,7 +25,7 @@ hoodie.metadata.enable=false
hoodie.deltastreamer.source.test.num_partitions=100
hoodie.deltastreamer.source.test.datagen.use_rocksdb_for_storing_existing_keys=false
hoodie.deltastreamer.source.test.max_unique_records=100000000
hoodie.embed.timeline.server=false

hoodie.deltastreamer.source.input.selector=org.apache.hudi.integ.testsuite.helpers.DFSTestSuitePathSelector

hoodie.deltastreamer.source.input.selector=org.apache.hudi.integ.testsuite.helpers.DFSTestSuitePathSelector
@@ -28,7 +28,7 @@ hoodie.metadata.enable=false
hoodie.deltastreamer.source.test.num_partitions=100
hoodie.deltastreamer.source.test.datagen.use_rocksdb_for_storing_existing_keys=false
hoodie.deltastreamer.source.test.max_unique_records=100000000
hoodie.embed.timeline.server=false

hoodie.deltastreamer.source.input.selector=org.apache.hudi.integ.testsuite.helpers.DFSTestSuitePathSelector

hoodie.deltastreamer.source.input.selector=org.apache.hudi.integ.testsuite.helpers.DFSTestSuitePathSelector
@@ -28,7 +28,7 @@ hoodie.metadata.enable=false
hoodie.deltastreamer.source.test.num_partitions=100
hoodie.deltastreamer.source.test.datagen.use_rocksdb_for_storing_existing_keys=false
hoodie.deltastreamer.source.test.max_unique_records=100000000
hoodie.embed.timeline.server=false

hoodie.deltastreamer.source.input.selector=org.apache.hudi.integ.testsuite.helpers.DFSTestSuitePathSelector

hoodie.deltastreamer.source.input.selector=org.apache.hudi.integ.testsuite.helpers.DFSTestSuitePathSelector
@@ -28,7 +28,7 @@ hoodie.metadata.enable=false
hoodie.deltastreamer.source.test.num_partitions=100
hoodie.deltastreamer.source.test.datagen.use_rocksdb_for_storing_existing_keys=false
hoodie.deltastreamer.source.test.max_unique_records=100000000
hoodie.embed.timeline.server=false

hoodie.deltastreamer.source.input.selector=org.apache.hudi.integ.testsuite.helpers.DFSTestSuitePathSelector

hoodie.deltastreamer.source.input.selector=org.apache.hudi.integ.testsuite.helpers.DFSTestSuitePathSelector
@@ -28,7 +28,7 @@ hoodie.metadata.enable=false
hoodie.deltastreamer.source.test.num_partitions=100
hoodie.deltastreamer.source.test.datagen.use_rocksdb_for_storing_existing_keys=false
hoodie.deltastreamer.source.test.max_unique_records=100000000
hoodie.embed.timeline.server=false

hoodie.deltastreamer.source.input.selector=org.apache.hudi.integ.testsuite.helpers.DFSTestSuitePathSelector

hoodie.deltastreamer.source.input.selector=org.apache.hudi.integ.testsuite.helpers.DFSTestSuitePathSelector
@@ -22,7 +22,7 @@ hoodie.bulkinsert.shuffle.parallelism=100
hoodie.deltastreamer.source.test.num_partitions=100
hoodie.deltastreamer.source.test.datagen.use_rocksdb_for_storing_existing_keys=false
hoodie.deltastreamer.source.test.max_unique_records=100000000
hoodie.embed.timeline.server=false

hoodie.deltastreamer.source.input.selector=org.apache.hudi.integ.testsuite.helpers.DFSTestSuitePathSelector

hoodie.datasource.hive_sync.skip_ro_suffix=true
@@ -28,7 +28,7 @@ hoodie.keep.min.commits=12
hoodie.keep.max.commits=14

hoodie.compact.inline=true
hoodie.embed.timeline.server=false


hoodie.deltastreamer.source.test.num_partitions=100
hoodie.deltastreamer.source.test.datagen.use_rocksdb_for_storing_existing_keys=false
@@ -26,7 +26,7 @@ hoodie.delete.shuffle.parallelism=25
hoodie.cleaner.commits.retained=8
hoodie.keep.min.commits=12
hoodie.keep.max.commits=14
hoodie.embed.timeline.server=false


hoodie.deltastreamer.source.test.num_partitions=100
hoodie.deltastreamer.source.test.datagen.use_rocksdb_for_storing_existing_keys=false
@@ -24,7 +24,7 @@ hoodie.bulkinsert.shuffle.parallelism=25
hoodie.delete.shuffle.parallelism=25

hoodie.compact.inline=true
hoodie.embed.timeline.server=false


hoodie.cleaner.commits.retained=8
hoodie.keep.min.commits=12
@@ -26,7 +26,7 @@ hoodie.delete.shuffle.parallelism=25
hoodie.cleaner.commits.retained=8
hoodie.keep.min.commits=12
hoodie.keep.max.commits=14
hoodie.embed.timeline.server=false


hoodie.deltastreamer.source.test.num_partitions=100
hoodie.deltastreamer.source.test.datagen.use_rocksdb_for_storing_existing_keys=false
@@ -23,7 +23,7 @@ hoodie.upsert.shuffle.parallelism=25
hoodie.bulkinsert.shuffle.parallelism=25
hoodie.delete.shuffle.parallelism=25

hoodie.embed.timeline.server=false


hoodie.compact.inline=true
hoodie.deltastreamer.source.test.num_partitions=100
@@ -29,7 +29,7 @@ hoodie.keep.max.commits=14

hoodie.compact.inline=true
hoodie.metadata.enable=true
hoodie.embed.timeline.server=false


hoodie.deltastreamer.source.test.num_partitions=100
hoodie.deltastreamer.source.test.datagen.use_rocksdb_for_storing_existing_keys=false
@@ -27,7 +27,7 @@ hoodie.cleaner.commits.retained=8
hoodie.keep.min.commits=12
hoodie.keep.max.commits=14

hoodie.embed.timeline.server=false

hoodie.metadata.enable=true

hoodie.deltastreamer.source.test.num_partitions=100
2 changes: 1 addition & 1 deletion docker/demo/config/test-suite/test-clustering.properties
@@ -23,7 +23,7 @@ hoodie.upsert.shuffle.parallelism=25
hoodie.bulkinsert.shuffle.parallelism=25
hoodie.delete.shuffle.parallelism=25

hoodie.embed.timeline.server=false


hoodie.deltastreamer.source.test.num_partitions=100
hoodie.deltastreamer.source.test.datagen.use_rocksdb_for_storing_existing_keys=false
@@ -25,7 +25,7 @@ hoodie.delete.shuffle.parallelism=25

hoodie.metadata.enable=false
hoodie.compact.inline=true
hoodie.embed.timeline.server=false


hoodie.deltastreamer.source.test.num_partitions=100
hoodie.deltastreamer.source.test.datagen.use_rocksdb_for_storing_existing_keys=false
@@ -27,7 +27,7 @@ hoodie.cleaner.commits.retained=8
hoodie.keep.min.commits=12
hoodie.keep.max.commits=14

hoodie.embed.timeline.server=false

hoodie.metadata.enable=true
hoodie.compact.inline=true

@@ -28,7 +28,7 @@ hoodie.keep.min.commits=12
hoodie.keep.max.commits=14

hoodie.metadata.enable=true
hoodie.embed.timeline.server=false


hoodie.deltastreamer.source.test.num_partitions=100
hoodie.deltastreamer.source.test.datagen.use_rocksdb_for_storing_existing_keys=false
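Across the test-suite properties files above, the change is the same: the explicit `hoodie.embed.timeline.server=false` line is dropped, so each config falls back to the default (the embedded timeline server appears to default to enabled in Hudi, but verify against your version's configuration reference). A small sketch of that key-absent-means-default lookup, using hypothetical file names:

```shell
#!/usr/bin/env bash
# Sketch: how removing 'hoodie.embed.timeline.server=false' changes the
# effective value. The default of "true" is an assumption from Hudi docs.
get_prop() {  # usage: get_prop <file> <key> <default>
  local v
  v=$(grep -E "^$2=" "$1" | tail -n 1 | cut -d= -f2-)
  echo "${v:-$3}"
}

cd "$(mktemp -d)"
printf 'hoodie.compact.inline=true\nhoodie.embed.timeline.server=false\n' > before.properties
printf 'hoodie.compact.inline=true\n' > after.properties

get_prop before.properties hoodie.embed.timeline.server true  # explicit: false
get_prop after.properties  hoodie.embed.timeline.server true  # absent: default
```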