Merged
@@ -185,10 +185,15 @@ protected BaseHiveConnectorTest()
protected static QueryRunner createHiveQueryRunner(Map<String, String> extraProperties, Consumer<QueryRunner> additionalSetup)
throws Exception
{
// Use faster compression codec in tests. TODO remove explicit config when default changes
verify(new HiveConfig().getHiveCompressionCodec() == HiveCompressionOption.GZIP);
String hiveCompressionCodec = HiveCompressionCodec.ZSTD.name();
Member:
Why not inline this?

Member (Author):
You mean use the "ZSTD" string literal? I think the current form makes clear which enum constant the value refers to.
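A minimal sketch of the point being made above: `Enum#name()` returns the constant's exact identifier, so the config value stays tied to the enum definition rather than a free-floating string. The enum below is a hypothetical stand-in for Trino's `HiveCompressionCodec`, not the real class.

```java
// Hypothetical stand-in for Trino's HiveCompressionCodec enum.
enum CompressionCodec { NONE, SNAPPY, GZIP, ZSTD }

public class EnumNameDemo
{
    public static void main(String[] args)
    {
        // name() yields the constant's identifier; a rename of the
        // enum constant would surface here at compile time, unlike
        // a hard-coded "ZSTD" literal.
        String codec = CompressionCodec.ZSTD.name();
        System.out.println(codec);
    }
}
```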


DistributedQueryRunner queryRunner = HiveQueryRunner.builder()
.setExtraProperties(extraProperties)
.setAdditionalSetup(additionalSetup)
.setHiveProperties(ImmutableMap.of(
"hive.compression-codec", hiveCompressionCodec,
"hive.allow-register-partition-procedure", "true",
// Reduce writer sort buffer size to ensure SortingFileWriter gets used
"hive.writer-sort-buffer-size", "1MB",
@@ -248,8 +248,10 @@ public DistributedQueryRunner build()
.put("hive.max-initial-split-size", "10kB") // so that each bucket has multiple splits
.put("hive.max-split-size", "10kB") // so that each bucket has multiple splits
.put("hive.storage-format", "TEXTFILE") // so that there's no minimum split size for the file
.put("hive.compression-codec", "NONE") // so that the file is splittable
.buildOrThrow();
hiveBucketedProperties = new HashMap<>(hiveBucketedProperties);
Member:
Undo?

Member (Author):
It's intentional. I need to override the compression codec for the second catalog that gets registered here:

hiveBucketedProperties.put("hive.compression-codec", "NONE"); // so that the file is splittable

queryRunner.createCatalog(HIVE_CATALOG, "hive", hiveProperties);
queryRunner.createCatalog(HIVE_BUCKETED_CATALOG, "hive", hiveBucketedProperties);
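The pattern discussed in the thread above can be sketched in isolation: the base properties map is immutable, so it is copied into a `HashMap` before a single entry is overridden for the second catalog. `Map.of` below is a stand-in for the Guava `ImmutableMap` used in the PR; the keys mirror the real Hive config properties.

```java
import java.util.HashMap;
import java.util.Map;

public class OverridePropertyDemo
{
    public static void main(String[] args)
    {
        // Stand-in for ImmutableMap.builder()...buildOrThrow();
        // attempts to mutate this map throw UnsupportedOperationException.
        Map<String, String> hiveProperties = Map.of(
                "hive.storage-format", "TEXTFILE",
                "hive.compression-codec", "ZSTD");

        // Copy into a mutable HashMap, then override one entry for
        // the bucketed catalog so the written files stay splittable.
        Map<String, String> hiveBucketedProperties = new HashMap<>(hiveProperties);
        hiveBucketedProperties.put("hive.compression-codec", "NONE");

        System.out.println(hiveBucketedProperties.get("hive.compression-codec"));
        System.out.println(hiveProperties.get("hive.compression-codec"));
    }
}
```

The copy leaves the original map untouched, so the first catalog still sees the ZSTD codec while the bucketed catalog gets NONE.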
