[HUDI-7071] Throw exceptions when clustering/index job fail #10050
Changes from all commits
@@ -172,18 +172,12 @@ public static void main(String[] args) {
       throw new HoodieException("Fail to run compaction for " + cfg.tableName + ", return code: " + 1);
     }
     final JavaSparkContext jsc = UtilHelpers.buildSparkContext("compactor-" + cfg.tableName, cfg.sparkMaster, cfg.sparkMemory);
-    int ret = 0;
-    try {
-      ret = new HoodieCompactor(jsc, cfg).compact(cfg.retry);
-    } catch (Throwable throwable) {
-      throw new HoodieException("Fail to run compaction for " + cfg.tableName + ", return code: " + ret, throwable);
-    } finally {
-      jsc.stop();
-    }
+    int ret = new HoodieCompactor(jsc, cfg).compact(cfg.retry);
+    if (ret != 0) {
+      throw new HoodieException("Fail to run compaction for " + cfg.tableName + ", return code: " + ret);
+    }
+    LOG.info("Success to run compaction for " + cfg.tableName);
+    jsc.stop();

Contributor: Could you help me understand why the try-catch block was removed here?
Contributor (Author): L175 will call …
   }

   public int compact(int retry) {
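To make the reviewer's suggestion above concrete, here is a minimal sketch, assuming nothing beyond plain Java, of how the new return-code check could be combined with a try/finally so the SparkContext is still stopped when the exception is thrown. The names runCompaction and stopContext are hypothetical stand-ins for new HoodieCompactor(jsc, cfg).compact(cfg.retry) and jsc.stop(); this is not the code in this PR.

// Hedged sketch only: hypothetical stand-ins replace the real Hudi/Spark calls
// so the failure-handling shape is visible without a Spark dependency.
public class CompactorShutdownSketch {

  // Hypothetical stand-in for new HoodieCompactor(jsc, cfg).compact(cfg.retry).
  static int runCompaction() {
    return 1; // pretend the compaction attempt failed
  }

  // Hypothetical stand-in for jsc.stop().
  static void stopContext() {
    System.out.println("SparkContext stopped");
  }

  public static void main(String[] args) {
    try {
      int ret = runCompaction();
      if (ret != 0) {
        // Same idea as the PR: surface the failure to the caller instead of
        // only logging it, so the driver process exits with a non-zero status.
        throw new RuntimeException("Fail to run compaction, return code: " + ret);
      }
      System.out.println("Success to run compaction");
    } finally {
      // The reviewer's concern: the context is stopped even when the
      // exception above is thrown.
      stopContext();
    }
  }
}

Running java CompactorShutdownSketch prints "SparkContext stopped" from the finally block and then exits non-zero because of the uncaught exception.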
@@ -149,19 +149,18 @@ public static void main(String[] args) {
     if (cfg.help || args.length == 0) {
       cmd.usage();
-      System.exit(1);
+      throw new HoodieException("Indexing failed for basePath : " + cfg.basePath);
     }

     final JavaSparkContext jsc = UtilHelpers.buildSparkContext("indexing-" + cfg.tableName, cfg.sparkMaster, cfg.sparkMemory);
     HoodieIndexer indexer = new HoodieIndexer(jsc, cfg);
     int result = indexer.start(cfg.retry);
     String resultMsg = String.format("Indexing with basePath: %s, tableName: %s, runningMode: %s",
         cfg.basePath, cfg.tableName, cfg.runningMode);
-    if (result == -1) {
-      LOG.error(resultMsg + " failed");
-    } else {
-      LOG.info(resultMsg + " success");
+    if (result != 0) {
+      throw new HoodieException(resultMsg + " failed");
     }
+    LOG.info(resultMsg + " success");
     jsc.stop();

Contributor: Same here. I think we can add a try-catch block here to make sure jsc exits gracefully.
Contributor: Looks like …
Contributor (Author): Yes.
   }
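For context on why the job now throws instead of only logging: when main logs the failure and returns, the JVM still exits with status 0 and anything watching the process (spark-submit wrappers, cron, workflow schedulers) treats the run as successful, whereas an uncaught exception terminates the driver with a non-zero status. Below is a standalone sketch, not Hudi code, with the result value hard-coded and a --throw flag invented purely for the demonstration.

// Standalone illustration, not Hudi code: shows why throwing on a bad result
// matters for callers that only observe the process exit status.
public class ExitStatusSketch {

  public static void main(String[] args) {
    int result = -1; // pretend the indexing run reported a failure

    // Hypothetical flag, used only to compare the two behaviours.
    boolean throwOnFailure = args.length > 0 && "--throw".equals(args[0]);

    if (result != 0) {
      if (throwOnFailure) {
        // New behaviour: the uncaught exception ends the JVM with a non-zero
        // exit status, so external schedulers can detect the failure.
        throw new RuntimeException("Indexing failed, result: " + result);
      }
      // Old behaviour: the failure is only logged, main returns normally,
      // and the JVM exits with status 0.
      System.err.println("Indexing failed, result: " + result);
    }
  }
}

java ExitStatusSketch prints the error but exits 0; java ExitStatusSketch --throw exits with status 1 from the uncaught exception.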
Same here
There was no try-catch block in HoodieClusteringJob originally. If the cluster call throws a HoodieException, the job returns -1 and jsc stops normally.
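A hedged sketch of the pattern that comment describes, not the actual HoodieClusteringJob source: an exception from the clustering call is mapped to a -1 return code, the SparkContext is stopped in a finally block, and the caller only has to check the return code, which this PR turns into an exception. cluster, runJob, and stopContext are hypothetical stand-ins.

// Hedged sketch of the described pattern; not Hudi's actual implementation.
public class ClusteringReturnCodeSketch {

  // Hypothetical stand-in for the clustering call, which may throw.
  static int cluster() {
    throw new RuntimeException("simulated clustering failure");
  }

  // Hypothetical stand-in for jsc.stop().
  static void stopContext() {
    System.out.println("SparkContext stopped");
  }

  // Exceptions from cluster() become a -1 return code, and the context is
  // stopped in finally, so it shuts down normally on both paths.
  static int runJob() {
    try {
      return cluster();
    } catch (Throwable t) {
      System.err.println("Clustering failed: " + t.getMessage());
      return -1;
    } finally {
      stopContext();
    }
  }

  public static void main(String[] args) {
    int ret = runJob();
    if (ret != 0) {
      // As in this PR: turn the non-zero code into an exception so the driver
      // exits with a non-zero status instead of silently succeeding.
      throw new RuntimeException("Clustering failed, return code: " + ret);
    }
  }
}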