Merged
**HoodieClusteringJob.java**

```diff
@@ -29,6 +29,7 @@
 import org.apache.hudi.common.util.Option;
 import org.apache.hudi.common.util.StringUtils;
 import org.apache.hudi.config.HoodieCleanConfig;
+import org.apache.hudi.exception.HoodieException;
 import org.apache.hudi.table.HoodieSparkTable;

 import com.beust.jcommander.JCommander;
@@ -146,19 +147,17 @@ public static void main(String[] args) {

     if (cfg.help || args.length == 0) {
       cmd.usage();
-      System.exit(1);
+      throw new HoodieException("Clustering failed for basePath: " + cfg.basePath);
     }

     final JavaSparkContext jsc = UtilHelpers.buildSparkContext("clustering-" + cfg.tableName, cfg.sparkMaster, cfg.sparkMemory);
-    HoodieClusteringJob clusteringJob = new HoodieClusteringJob(jsc, cfg);
-    int result = clusteringJob.cluster(cfg.retry);
+    int result = new HoodieClusteringJob(jsc, cfg).cluster(cfg.retry);
     String resultMsg = String.format("Clustering with basePath: %s, tableName: %s, runningMode: %s",
         cfg.basePath, cfg.tableName, cfg.runningMode);
-    if (result == -1) {
-      LOG.error(resultMsg + " failed");
-    } else {
-      LOG.info(resultMsg + " success");
+    if (result != 0) {
+      throw new HoodieException(resultMsg + " failed");
     }
+    LOG.info(resultMsg + " success");
     jsc.stop();
```
**Contributor:** Same here.

**Contributor (Author):** There was no try-catch block in HoodieClusteringJob originally. If `cluster` throws a `HoodieException`, the job returns -1 and `jsc` stops normally.

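The change above replaces a silent `LOG.error(...)` on failure with a thrown `HoodieException`, so a non-zero result code actually fails the Spark application instead of exiting cleanly. A minimal stand-alone sketch of the two patterns (the class, method names, and `JobFailedException` below are hypothetical simplifications for illustration, not Hudi APIs):

```java
// Sketch of the error-handling change: a non-zero result code is turned
// into an exception so the caller (e.g. a scheduler watching the Spark
// driver) observes the failure.
public class ResultToException {

    // Hypothetical stand-in for HoodieException.
    public static class JobFailedException extends RuntimeException {
        public JobFailedException(String msg) { super(msg); }
    }

    // Old pattern: log and swallow the failure; the JVM still exits with 0.
    public static String runJobOld(int result, String resultMsg) {
        if (result == -1) {
            return resultMsg + " failed";   // only logged, never propagated
        }
        return resultMsg + " success";
    }

    // New pattern: any non-zero result (not just -1) fails the job loudly.
    public static String runJobNew(int result, String resultMsg) {
        if (result != 0) {
            throw new JobFailedException(resultMsg + " failed");
        }
        return resultMsg + " success";
    }

    public static void main(String[] args) {
        System.out.println(runJobOld(-1, "Clustering")); // Clustering failed
        System.out.println(runJobNew(0, "Clustering"));  // Clustering success
        try {
            runJobNew(-1, "Clustering");
        } catch (JobFailedException e) {
            System.out.println("caught: " + e.getMessage());
        }
    }
}
```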
**HoodieCompactor.java**

```diff
@@ -172,18 +172,12 @@ public static void main(String[] args) {
       throw new HoodieException("Fail to run compaction for " + cfg.tableName + ", return code: " + 1);
     }
     final JavaSparkContext jsc = UtilHelpers.buildSparkContext("compactor-" + cfg.tableName, cfg.sparkMaster, cfg.sparkMemory);
-    int ret = 0;
-    try {
-      ret = new HoodieCompactor(jsc, cfg).compact(cfg.retry);
-    } catch (Throwable throwable) {
-      throw new HoodieException("Fail to run compaction for " + cfg.tableName + ", return code: " + ret, throwable);
-    } finally {
-      jsc.stop();
-    }
-
+    int ret = new HoodieCompactor(jsc, cfg).compact(cfg.retry);
     if (ret != 0) {
       throw new HoodieException("Fail to run compaction for " + cfg.tableName + ", return code: " + ret);
     }
     LOG.info("Success to run compaction for " + cfg.tableName);
+    jsc.stop();
```
**Contributor:** Could you help me understand why the try-catch block was removed here? If L175 failed and threw an exception, then L180 would not be executed.

**Contributor (Author):** L175 calls `UtilHelpers.retry` internally, which has its own try-catch block, so this try-catch block is unreachable.

**HoodieIndexer.java**

```diff
@@ -149,19 +149,18 @@ public static void main(String[] args) {

     if (cfg.help || args.length == 0) {
       cmd.usage();
-      System.exit(1);
+      throw new HoodieException("Indexing failed for basePath : " + cfg.basePath);
     }

     final JavaSparkContext jsc = UtilHelpers.buildSparkContext("indexing-" + cfg.tableName, cfg.sparkMaster, cfg.sparkMemory);
     HoodieIndexer indexer = new HoodieIndexer(jsc, cfg);
     int result = indexer.start(cfg.retry);
     String resultMsg = String.format("Indexing with basePath: %s, tableName: %s, runningMode: %s",
         cfg.basePath, cfg.tableName, cfg.runningMode);
-    if (result == -1) {
-      LOG.error(resultMsg + " failed");
-    } else {
-      LOG.info(resultMsg + " success");
+    if (result != 0) {
+      throw new HoodieException(resultMsg + " failed");
     }
+    LOG.info(resultMsg + " success");
     jsc.stop();
```
**Contributor:** Same here; I think we can add a try-catch block here to make sure `jsc` exits gracefully.

**Contributor:** Looks like `HoodieIndexer` also uses `UtilHelpers.retry`.

**Contributor (Author):** Yes. `HoodieCompactor`, `HoodieClusteringJob`, and `HoodieIndexer` all use `UtilHelpers.retry`, so there is no need to add a try-catch block in the main method.
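For reference, the reviewer's suggested alternative, guaranteeing `jsc.stop()` via try-finally even when the job body throws, would look like the sketch below (`FakeContext` is a hypothetical stand-in for `JavaSparkContext`, written for this illustration). The thread concludes it is unnecessary here because `UtilHelpers.retry` never lets exceptions escape:

```java
// Sketch of the try-finally pattern the reviewer proposed: stop() is
// guaranteed to run whether or not the job body throws.
public class GracefulStopSketch {

    // Hypothetical stand-in for JavaSparkContext.
    public static class FakeContext {
        public boolean stopped = false;
        public void stop() { stopped = true; }
    }

    // Returns whether the context ended up stopped on the given path.
    public static boolean stopGuaranteed(boolean jobThrows) {
        FakeContext ctx = new FakeContext();
        try {
            if (jobThrows) {
                throw new RuntimeException("job failed");
            }
        } catch (RuntimeException e) {
            // swallowed for the sketch; real code might rethrow a HoodieException
        } finally {
            ctx.stop(); // runs on both the success and failure paths
        }
        return ctx.stopped;
    }

    public static void main(String[] args) {
        System.out.println(stopGuaranteed(false)); // true
        System.out.println(stopGuaranteed(true));  // true
    }
}
```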
