This repository was archived by the owner on Sep 2, 2024. It is now read-only.
Merged
33 commits
ee97f23
PARQUET-1498: Add instructions to install thrift via homebrew (#595)
xhochy Jan 22, 2019
4b40d96
PARQUET-1502: Convert FIXED_LEN_BYTE_ARRAY to arrow type in logicalTy…
yongyanw Jan 23, 2019
354fcc2
[PARQUET-1506] Migrate maven-thrift-plugin to thrift-maven-plugin (#…
Fokko Jan 24, 2019
f36dd08
[PARQUET-1500] Replace Closeables with try-with-resources (#597)
Fokko Jan 25, 2019
1e62e2e
PARQUET-1503: Remove Ints Utility Class (#598)
belugabehr Jan 25, 2019
d1e9f15
PARQUET-1513: Update HiddenFileFilter to avoid extra startsWith (#606)
belugabehr Jan 27, 2019
00a7a47
PARQUET-1504: Add an option to convert Int96 to Arrow Timestamp (#594)
yongyanw Jan 27, 2019
ddc7747
PARQUET-1509: Note Hive deprecation in README. (#602)
belugabehr Jan 27, 2019
d9a1962
PARQUET-1510: Fix notEq for optional columns with null values. (#603)
rdblue Jan 28, 2019
9d1006f
[PARQUET-1507] Bump Apache Thrift to 0.12.0 (#601)
Fokko Jan 30, 2019
1b103da
PARQUET-1518: Use Jackson2 version 2.9.8 in parquet-cli (#609)
Fokko Jan 31, 2019
51c4cc3
PARQUET-138: Allow merging more restrictive field in less restrictive…
ntrinquier Feb 1, 2019
3537c88
Add javax.annotation-api dependency for JDK >= 9 (#604)
xhochy Feb 5, 2019
82935e6
PARQUET-1470: Inputstream leakage in ParquetFileWriter.appendFile (#611)
Fokko Feb 6, 2019
5bd1265
PARQUET-1514: ParquetFileWriter Records Compressed Bytes instead of U…
belugabehr Feb 6, 2019
714bb45
PARQUET-1505: Use Java 7 NIO StandardCharsets (#599)
belugabehr Feb 7, 2019
6901a20
PARQUET-1480 INT96 to avro not yet implemented error should mention d…
tims Feb 7, 2019
7dcdcdc
PARQUET-1485: Fix Snappy direct memory leak (#581)
Feb 12, 2019
9461845
PARQUET-1527: [parquet-tools] cat command throw java.lang.ClassCastE…
masayuki038 Feb 12, 2019
dcfd53a
PARQUET-1529: Shade fastutil in all modules where used (#617)
gszadovszky Feb 13, 2019
f2c5b9a
Update CHANGES.md for 1.11.0rc4
gszadovszky Feb 14, 2019
22a9f54
[maven-release-plugin] prepare release apache-parquet-1.11.0
gszadovszky Feb 14, 2019
4cc22dd
[maven-release-plugin] prepare for next development iteration
gszadovszky Feb 14, 2019
f799893
PARQUET-1533: TestSnappy() throws OOM exception with Parquet-1485 cha…
gszadovszky Feb 25, 2019
ab42fe5
Revert "PARQUET-1381: Add merge blocks command to parquet-tools (#512…
gszadovszky Feb 25, 2019
892dedb
PARQUET-1531: Page row count limit causes empty pages to be written f…
gszadovszky Mar 13, 2019
acaf9e6
Update CHANGES.md for 1.11.0rc5
gszadovszky Mar 13, 2019
e85dbd3
[maven-release-plugin] prepare release apache-parquet-1.11.0
gszadovszky Mar 13, 2019
ac65040
[maven-release-plugin] prepare for next development iteration
gszadovszky Mar 13, 2019
5acfd32
Revert "Revert "PARQUET-1435: Benchmark filtering column-indexes (#53…
yifeih Mar 13, 2019
f64ff19
Revert "Revert "PARQUET-1414: Limit page size based on maximum row co…
yifeih Mar 13, 2019
c407572
Merge remote-tracking branch 'apache/master'
yifeih Mar 13, 2019
773fd4d
upgrade docker image thrift version
yifeih Mar 13, 2019
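Several of the commits above are mechanical cleanups. PARQUET-1500, for example, replaces uses of the `Closeables` utility with Java 7 try-with-resources. A minimal sketch of the pattern (illustrative names, not the patch's code): a resource declared in the `try` header is closed automatically, even when the body throws, so there is no `finally` block to forget and no swallowed `close()` failure.

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class TryWithResourcesExample {

    // Before this change, code typically called Closeables.close(in) in a
    // finally block. With try-with-resources, in.close() runs automatically
    // when the block exits, normally or exceptionally.
    static int readFirstByte(byte[] data) throws IOException {
        try (InputStream in = new ByteArrayInputStream(data)) {
            return in.read();
        } // in.close() is invoked here in all cases
    }

    public static void main(String[] args) throws IOException {
        System.out.println(readFirstByte(new byte[] {42, 7})); // prints 42
    }
}
```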
39 changes: 37 additions & 2 deletions CHANGES.md
Original file line number Diff line number Diff line change
Expand Up @@ -26,6 +26,7 @@ Release Notes - Parquet - Version 1.11.0
#### Bug

* [PARQUET-1364](https://issues.apache.org/jira/browse/PARQUET-1364) - Column Indexes: Invalid row indexes for pages starting with nulls
* [PARQUET-138](https://issues.apache.org/jira/browse/PARQUET-138) - Parquet should allow a merge between required and optional schemas
* [PARQUET-952](https://issues.apache.org/jira/browse/PARQUET-952) - Avro union with single type fails with 'is not a group'
* [PARQUET-1128](https://issues.apache.org/jira/browse/PARQUET-1128) - \[Java\] Upgrade the Apache Arrow version to 0.8.0 for SchemaConverter
* [PARQUET-1285](https://issues.apache.org/jira/browse/PARQUET-1285) - \[Java\] SchemaConverter should not convert from TimeUnit.SECOND AND TimeUnit.NANOSECOND of Arrow
Expand All @@ -49,18 +50,29 @@ Release Notes - Parquet - Version 1.11.0
* [PARQUET-1456](https://issues.apache.org/jira/browse/PARQUET-1456) - Use page index, ParquetFileReader throw ArrayIndexOutOfBoundsException
* [PARQUET-1460](https://issues.apache.org/jira/browse/PARQUET-1460) - Fix javadoc errors and include javadoc checking in Travis checks
* [PARQUET-1461](https://issues.apache.org/jira/browse/PARQUET-1461) - Third party code does not compile after parquet-mr minor version update
* [PARQUET-1470](https://issues.apache.org/jira/browse/PARQUET-1470) - Inputstream leakage in ParquetFileWriter.appendFile
* [PARQUET-1472](https://issues.apache.org/jira/browse/PARQUET-1472) - Dictionary filter fails on FIXED\_LEN\_BYTE\_ARRAY
* [PARQUET-1475](https://issues.apache.org/jira/browse/PARQUET-1475) - DirectCodecFactory's ParquetCompressionCodecException drops a passed in cause in one constructor
* [PARQUET-1478](https://issues.apache.org/jira/browse/PARQUET-1478) - Can't read spec compliant, 3-level lists via parquet-proto
* [PARQUET-1480](https://issues.apache.org/jira/browse/PARQUET-1480) - INT96 to avro not yet implemented error should mention deprecation
* [PARQUET-1485](https://issues.apache.org/jira/browse/PARQUET-1485) - Snappy Decompressor/Compressor may cause direct memory leak
* [PARQUET-1498](https://issues.apache.org/jira/browse/PARQUET-1498) - \[Java\] Add instructions to install thrift via homebrew
* [PARQUET-1510](https://issues.apache.org/jira/browse/PARQUET-1510) - Dictionary filter skips null values when evaluating not-equals.
* [PARQUET-1514](https://issues.apache.org/jira/browse/PARQUET-1514) - ParquetFileWriter Records Compressed Bytes instead of Uncompressed Bytes
* [PARQUET-1527](https://issues.apache.org/jira/browse/PARQUET-1527) - \[parquet-tools\] cat command throw java.lang.ClassCastException
* [PARQUET-1529](https://issues.apache.org/jira/browse/PARQUET-1529) - Shade fastutil in all modules where used
* [PARQUET-1531](https://issues.apache.org/jira/browse/PARQUET-1531) - Page row count limit causes empty pages to be written from MessageColumnIO
* [PARQUET-1533](https://issues.apache.org/jira/browse/PARQUET-1533) - TestSnappy() throws OOM exception with Parquet-1485 change

#### New Feature

* [PARQUET-1201](https://issues.apache.org/jira/browse/PARQUET-1201) - Column indexes
* [PARQUET-1253](https://issues.apache.org/jira/browse/PARQUET-1253) - Support for new logical type representation
* [PARQUET-1381](https://issues.apache.org/jira/browse/PARQUET-1381) - Add merge blocks command to parquet-tools
* [PARQUET-1388](https://issues.apache.org/jira/browse/PARQUET-1388) - Nanosecond precision time and timestamp - parquet-mr

#### Improvement

* [PARQUET-1280](https://issues.apache.org/jira/browse/PARQUET-1280) - \[parquet-protobuf\] Use maven protoc plugin
* [PARQUET-1321](https://issues.apache.org/jira/browse/PARQUET-1321) - LogicalTypeAnnotation.LogicalTypeAnnotationVisitor#visit methods should have a return value
* [PARQUET-1335](https://issues.apache.org/jira/browse/PARQUET-1335) - Logical type names in parquet-mr are not consistent with parquet-format
* [PARQUET-1336](https://issues.apache.org/jira/browse/PARQUET-1336) - PrimitiveComparator should implements Serializable
Expand All @@ -73,18 +85,41 @@ Release Notes - Parquet - Version 1.11.0
* [PARQUET-1418](https://issues.apache.org/jira/browse/PARQUET-1418) - Run integration tests in Travis
* [PARQUET-1435](https://issues.apache.org/jira/browse/PARQUET-1435) - Benchmark filtering column-indexes
* [PARQUET-1462](https://issues.apache.org/jira/browse/PARQUET-1462) - Allow specifying new development version in prepare-release.sh
* [PARQUET-1466](https://issues.apache.org/jira/browse/PARQUET-1466) - Upgrade to the latest guava 27.0-jre
* [PARQUET-1474](https://issues.apache.org/jira/browse/PARQUET-1474) - Less verbose and lower level logging for missing column/offset indexes
* [PARQUET-1476](https://issues.apache.org/jira/browse/PARQUET-1476) - Don't emit a warning message for files without new logical type
* [PARQUET-1487](https://issues.apache.org/jira/browse/PARQUET-1487) - Do not write original type for timezone-agnostic timestamps
* [PARQUET-1489](https://issues.apache.org/jira/browse/PARQUET-1489) - Insufficient documentation for UserDefinedPredicate.keep(T)
* [PARQUET-1490](https://issues.apache.org/jira/browse/PARQUET-1490) - Add branch-specific Travis steps
* [PARQUET-1492](https://issues.apache.org/jira/browse/PARQUET-1492) - Remove protobuf install in travis build
* [PARQUET-1500](https://issues.apache.org/jira/browse/PARQUET-1500) - Remove the Closables
* [PARQUET-1502](https://issues.apache.org/jira/browse/PARQUET-1502) - Convert FIXED\_LEN\_BYTE\_ARRAY to arrow type in
* [PARQUET-1503](https://issues.apache.org/jira/browse/PARQUET-1503) - Remove Ints Utility Class
* [PARQUET-1504](https://issues.apache.org/jira/browse/PARQUET-1504) - Add an option to convert Parquet Int96 to Arrow Timestamp
* [PARQUET-1505](https://issues.apache.org/jira/browse/PARQUET-1505) - Use Java 7 NIO StandardCharsets
* [PARQUET-1506](https://issues.apache.org/jira/browse/PARQUET-1506) - Migrate from maven-thrift-plugin to thrift-maven-plugin
* [PARQUET-1507](https://issues.apache.org/jira/browse/PARQUET-1507) - Bump Apache Thrift to 0.12.0
* [PARQUET-1509](https://issues.apache.org/jira/browse/PARQUET-1509) - Update Docs for Hive Deprecation
* [PARQUET-1513](https://issues.apache.org/jira/browse/PARQUET-1513) - HiddenFileFilter Streamline
* [PARQUET-1518](https://issues.apache.org/jira/browse/PARQUET-1518) - Bump Jackson2 version of parquet-cli

#### Task

* [PARQUET-968](https://issues.apache.org/jira/browse/PARQUET-968) - Add Hive/Presto support in ProtoParquet
* [PARQUET-1294](https://issues.apache.org/jira/browse/PARQUET-1294) - Update release scripts for the new Apache policy
* [PARQUET-1434](https://issues.apache.org/jira/browse/PARQUET-1434) - Release parquet-mr 1.11.0
* [PARQUET-1436](https://issues.apache.org/jira/browse/PARQUET-1436) - TimestampMicrosStringifier shows wrong microseconds for timestamps before 1970
* [PARQUET-1452](https://issues.apache.org/jira/browse/PARQUET-1452) - Deprecate old logical types API

### Version 1.10.1 ###

Release Notes - Parquet - Version 1.10.1

#### Bug

* [PARQUET-1510](https://issues.apache.org/jira/browse/PARQUET-1510) \- Dictionary filter skips null values when evaluating not-equals.
* [PARQUET-1309](https://issues.apache.org/jira/browse/PARQUET-1309) \- Parquet Java uses incorrect stats and dictionary filter properties

### Version 1.10.0 ###

Release Notes - Parquet - Version 1.10.0
Expand Down
5 changes: 1 addition & 4 deletions FORK.md
@@ -1,8 +1,5 @@
# Differences to mainline
This repo exists mostly to make releases of parquet-mr more often. The only difference to upstream is as follows:

1. Solution for [PARQUET-686](https://issues.apache.org/jira/browse/PARQUET-686).
2. Temporarily revert [PARQUET-1414](https://issues.apache.org/jira/browse/PARQUET-1414) because it causes Spark to write unreadable empty Parquet pages.
This repo exists mostly to make releases of parquet-mr more often. The only difference to upstream is the solution for [PARQUET-686](https://issues.apache.org/jira/browse/PARQUET-686).

The change that we made, which upstream only made in statistics v2, is to change binary comparison to be unsigned and declare all statistics prior to that change as corrupted. This lets us take advantage of binary statistics more quickly and removes the burden on users to know whether they should account for signed binary comparison in their values.

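The unsigned ordering described above can be sketched as a lexicographic comparison that treats each byte as a value in 0–255. This is a minimal illustration of the technique, not the repository's code:

```java
public class UnsignedBinaryCompare {

    // Lexicographic comparison treating each byte as unsigned (0..255).
    // Signed comparison would order 0x80 (-128) before 0x7F (127);
    // unsigned comparison orders 0x7F (127) before 0x80 (128).
    static int compareUnsigned(byte[] a, byte[] b) {
        int n = Math.min(a.length, b.length);
        for (int i = 0; i < n; i++) {
            int cmp = Integer.compare(a[i] & 0xFF, b[i] & 0xFF);
            if (cmp != 0) {
                return cmp;
            }
        }
        // A proper prefix sorts before the longer array.
        return Integer.compare(a.length, b.length);
    }

    public static void main(String[] args) {
        byte[] x = {(byte) 0x7F}; // 127 signed and unsigned
        byte[] y = {(byte) 0x80}; // -128 signed, 128 unsigned
        System.out.println(compareUnsigned(x, y) < 0); // prints true
    }
}
```

Min/max statistics computed with a signed comparator disagree with this ordering, which is why pre-change binary statistics are treated as corrupted.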
Expand Down
17 changes: 13 additions & 4 deletions README.md
Expand Up @@ -35,14 +35,21 @@ Parquet-MR uses Maven to build and depends on the thrift compiler (protoc is now
To build and install the thrift compiler, run:

```
wget -nv http://archive.apache.org/dist/thrift/0.9.3/thrift-0.9.3.tar.gz
tar xzf thrift-0.9.3.tar.gz
cd thrift-0.9.3
wget -nv http://archive.apache.org/dist/thrift/0.12.0/thrift-0.12.0.tar.gz
tar xzf thrift-0.12.0.tar.gz
cd thrift-0.12.0
chmod +x ./configure
./configure --disable-gen-erl --disable-gen-hs --without-ruby --without-haskell --without-erlang --without-php --without-nodejs
sudo make install
```

If you're on OSX and use homebrew, you can instead install Thrift 0.12.0 with `brew` and ensure that it comes first in your `PATH`.

```
brew install thrift@0.12.0
export PATH="/usr/local/opt/thrift@0.12.0/bin:$PATH"
```

### Build Parquet with Maven

Once protobuf and thrift are available in your path, you can build the project by running:
Expand All @@ -57,7 +64,7 @@ Parquet is a very active project, and new features are being added quickly. Here


* Type-specific encoding
* Hive integration
* Hive integration (deprecated)
* Pig integration
* Cascading integration
* Crunch integration
Expand Down Expand Up @@ -120,6 +127,8 @@ If the data was stored using Pig, things will "just work". If the data was store

Hive integration is provided via the [parquet-hive](https://github.com/apache/parquet-mr/tree/master/parquet-hive) sub-project.

Hive integration is now deprecated within the Parquet project; it is maintained by Apache Hive instead.

## Build

To run the unit tests: `mvn test`
Expand Down
4 changes: 2 additions & 2 deletions dev/docker-images/Dockerfile
Expand Up @@ -99,10 +99,10 @@ RUN groupadd --gid 3434 circleci \

SHELL ["/bin/bash", "-eux", "-o", "pipefail", "-c"]

RUN THRIFT_URL="http://archive.apache.org/dist/thrift/0.9.3/thrift-0.9.3.tar.gz" \
RUN THRIFT_URL="http://archive.apache.org/dist/thrift/0.12.0/thrift-0.12.0.tar.gz" \
&& curl --silent --show-error --location --fail --retry 3 --output /tmp/thrift.tar.gz $THRIFT_URL \
&& tar -C /tmp -xzvf /tmp/thrift.tar.gz \
&& cd /tmp/thrift-0.9.3 \
&& cd /tmp/thrift-0.12.0 \
&& chmod +x ./configure \
&& ./configure --disable-gen-erl --disable-gen-hs --without-ruby --without-haskell --without-erlang --without-python \
&& make install \
Expand Down
8 changes: 5 additions & 3 deletions dev/travis-before_install.sh
Expand Up @@ -19,6 +19,8 @@
# This script gets invoked by .travis.yml in the before_install step
################################################################################

export THRIFT_VERSION=0.12.0

set -e
date
sudo apt-get update -qq
Expand All @@ -27,9 +29,9 @@ sudo apt-get install -qq build-essential pv autoconf automake libtool curl make
libevent-dev automake libtool flex bison pkg-config g++ libssl-dev xmlstarlet
date
pwd
wget -nv http://archive.apache.org/dist/thrift/0.9.3/thrift-0.9.3.tar.gz
tar zxf thrift-0.9.3.tar.gz
cd thrift-0.9.3
wget -nv http://archive.apache.org/dist/thrift/${THRIFT_VERSION}/thrift-${THRIFT_VERSION}.tar.gz
tar zxf thrift-${THRIFT_VERSION}.tar.gz
cd thrift-${THRIFT_VERSION}
chmod +x ./configure
./configure --disable-gen-erl --disable-gen-hs --without-ruby --without-haskell --without-erlang --without-php --without-nodejs
sudo make install
Expand Down
Original file line number Diff line number Diff line change
Expand Up @@ -86,10 +86,19 @@
*/
public class SchemaConverter {

// Indicates if Int96 should be converted to Arrow Timestamp
private final boolean convertInt96ToArrowTimestamp;

/**
* For when we'll need this to be configurable
*/
public SchemaConverter() {
this(false);
}

// TODO(PARQUET-1511): pass the parameters in a configuration object
public SchemaConverter(final boolean convertInt96ToArrowTimestamp) {
this.convertInt96ToArrowTimestamp = convertInt96ToArrowTimestamp;
}

/**
Expand Down Expand Up @@ -492,13 +501,26 @@ private String getTimeZone(LogicalTypeAnnotation.TimestampLogicalTypeAnnotation

@Override
public TypeMapping convertINT96(PrimitiveTypeName primitiveTypeName) throws RuntimeException {
// Possibly timestamp
return field(new ArrowType.Binary());
if (convertInt96ToArrowTimestamp) {
return field(new ArrowType.Timestamp(TimeUnit.NANOSECOND, null));
} else {
return field(new ArrowType.Binary());
}
}

@Override
public TypeMapping convertFIXED_LEN_BYTE_ARRAY(PrimitiveTypeName primitiveTypeName) throws RuntimeException {
return field(new ArrowType.Binary());
LogicalTypeAnnotation logicalTypeAnnotation = type.getLogicalTypeAnnotation();
if (logicalTypeAnnotation == null) {
return field(new ArrowType.Binary());
}

return logicalTypeAnnotation.accept(new LogicalTypeAnnotation.LogicalTypeAnnotationVisitor<TypeMapping>() {
@Override
public Optional<TypeMapping> visit(LogicalTypeAnnotation.DecimalLogicalTypeAnnotation decimalLogicalType) {
return of(decimal(decimalLogicalType.getPrecision(), decimalLogicalType.getScale()));
}
}).orElseThrow(() -> new IllegalArgumentException("illegal type " + type));
}

@Override
Expand Down
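PARQUET-1504's new option maps INT96 to an Arrow nanosecond timestamp at the schema level. The INT96 value itself is 12 bytes: an 8-byte little-endian count of nanoseconds within the day, followed by a 4-byte little-endian Julian day number. A hedged sketch of how such a value decodes to epoch nanoseconds (illustrative only; this change converts the schema, not the values):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class Int96Decoder {

    private static final long JULIAN_DAY_OF_EPOCH = 2440588L; // 1970-01-01
    private static final long NANOS_PER_DAY = 86_400L * 1_000_000_000L;

    // Decodes a 12-byte INT96 timestamp: 8-byte little-endian nanos-of-day,
    // then 4-byte little-endian Julian day number.
    static long toEpochNanos(byte[] int96) {
        ByteBuffer buf = ByteBuffer.wrap(int96).order(ByteOrder.LITTLE_ENDIAN);
        long nanosOfDay = buf.getLong();
        long julianDay = buf.getInt();
        return (julianDay - JULIAN_DAY_OF_EPOCH) * NANOS_PER_DAY + nanosOfDay;
    }

    public static void main(String[] args) {
        // Midnight at the Unix epoch: Julian day 2440588, zero nanos of day.
        ByteBuffer buf = ByteBuffer.allocate(12).order(ByteOrder.LITTLE_ENDIAN);
        buf.putLong(0L).putInt(2440588);
        System.out.println(toEpochNanos(buf.array())); // prints 0
    }
}
```

This also explains the `null` timezone in `new ArrowType.Timestamp(TimeUnit.NANOSECOND, null)`: INT96 timestamps carry no zone information.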
Original file line number Diff line number Diff line change
Expand Up @@ -47,6 +47,7 @@
import static org.apache.parquet.schema.PrimitiveType.PrimitiveTypeName.FLOAT;
import static org.apache.parquet.schema.PrimitiveType.PrimitiveTypeName.INT32;
import static org.apache.parquet.schema.PrimitiveType.PrimitiveTypeName.INT64;
import static org.apache.parquet.schema.PrimitiveType.PrimitiveTypeName.INT96;

import java.io.IOException;
import java.util.List;
Expand Down Expand Up @@ -419,6 +420,47 @@ public void testParquetInt64TimeMicrosToArrow() {
Assert.assertEquals(expected, converter.fromParquet(parquet).getArrowSchema());
}

@Test
public void testParquetFixedBinaryToArrow() {
MessageType parquet = Types.buildMessage()
.addField(Types.optional(FIXED_LEN_BYTE_ARRAY).length(12).named("a")).named("root");
Schema expected = new Schema(asList(
field("a", new ArrowType.Binary())
));
Assert.assertEquals(expected, converter.fromParquet(parquet).getArrowSchema());
}

@Test
public void testParquetFixedBinaryToArrowDecimal() {
MessageType parquet = Types.buildMessage()
.addField(Types.optional(FIXED_LEN_BYTE_ARRAY).length(5).as(DECIMAL).precision(8).scale(2).named("a")).named("root");
Schema expected = new Schema(asList(
field("a", new ArrowType.Decimal(8, 2))
));
Assert.assertEquals(expected, converter.fromParquet(parquet).getArrowSchema());
}

@Test
public void testParquetInt96ToArrowBinary() {
MessageType parquet = Types.buildMessage()
.addField(Types.optional(INT96).named("a")).named("root");
Schema expected = new Schema(asList(
field("a", new ArrowType.Binary())
));
Assert.assertEquals(expected, converter.fromParquet(parquet).getArrowSchema());
}

@Test
public void testParquetInt96ToArrowTimestamp() {
final SchemaConverter converterInt96ToTimestamp = new SchemaConverter(true);
MessageType parquet = Types.buildMessage()
.addField(Types.optional(INT96).named("a")).named("root");
Schema expected = new Schema(asList(
field("a", new ArrowType.Timestamp(TimeUnit.NANOSECOND, null))
));
Assert.assertEquals(expected, converterInt96ToTimestamp.fromParquet(parquet).getArrowSchema());
}

@Test(expected = IllegalStateException.class)
public void testParquetInt64TimeMillisToArrow() {
converter.fromParquet(Types.buildMessage()
Expand Down
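The `testParquetFixedBinaryToArrowDecimal` case above maps a DECIMAL-annotated fixed-length binary to `ArrowType.Decimal(8, 2)`. Decoding such a value at read time follows Parquet's standard layout for DECIMAL over FIXED_LEN_BYTE_ARRAY: a big-endian two's-complement unscaled integer, scaled by the schema's scale. A minimal decoding sketch (illustrative, not code from this change):

```java
import java.math.BigDecimal;
import java.math.BigInteger;

public class FixedDecimalDecode {

    // The fixed-length bytes hold a big-endian two's-complement unscaled
    // integer; the scale comes from the DECIMAL annotation in the schema.
    static BigDecimal decode(byte[] bytes, int scale) {
        return new BigDecimal(new BigInteger(bytes), scale);
    }

    public static void main(String[] args) {
        // A 5-byte fixed with precision 8, scale 2 (as in the test):
        // unscaled 123456 (0x01E240) decodes to 1234.56.
        byte[] bytes = {0, 0, 0x01, (byte) 0xE2, 0x40};
        System.out.println(decode(bytes, 2)); // prints 1234.56
    }
}
```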
4 changes: 4 additions & 0 deletions parquet-avro/pom.xml
Expand Up @@ -161,6 +161,10 @@
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
</plugin>
</plugins>
</build>

Expand Down
Original file line number Diff line number Diff line change
Expand Up @@ -291,7 +291,7 @@ public Schema convertINT64(PrimitiveTypeName primitiveTypeName) {
}
@Override
public Schema convertINT96(PrimitiveTypeName primitiveTypeName) {
throw new IllegalArgumentException("INT96 not yet implemented.");
throw new IllegalArgumentException("INT96 not implemented and is deprecated");
}
@Override
public Schema convertFLOAT(PrimitiveTypeName primitiveTypeName) {
Expand Down
Original file line number Diff line number Diff line change
Expand Up @@ -18,7 +18,6 @@
*/
package org.apache.parquet.avro;

import com.google.common.base.Charsets;
import com.google.common.collect.ImmutableMap;
import com.google.common.collect.Lists;
import com.google.common.io.Resources;
Expand All @@ -27,6 +26,7 @@
import java.math.BigDecimal;
import java.math.BigInteger;
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
Expand Down Expand Up @@ -369,7 +369,7 @@ public void testAll() throws Exception {
.set("mylong", 2L)
.set("myfloat", 3.1f)
.set("mydouble", 4.1)
.set("mybytes", ByteBuffer.wrap("hello".getBytes(Charsets.UTF_8)))
.set("mybytes", ByteBuffer.wrap("hello".getBytes(StandardCharsets.UTF_8)))
.set("mystring", "hello")
.set("mynestedrecord", nestedRecord)
.set("myenum", "a")
Expand Down Expand Up @@ -398,7 +398,7 @@ public void testAll() throws Exception {
assertEquals(2L, nextRecord.get("mylong"));
assertEquals(3.1f, nextRecord.get("myfloat"));
assertEquals(4.1, nextRecord.get("mydouble"));
assertEquals(ByteBuffer.wrap("hello".getBytes(Charsets.UTF_8)), nextRecord.get("mybytes"));
assertEquals(ByteBuffer.wrap("hello".getBytes(StandardCharsets.UTF_8)), nextRecord.get("mybytes"));
assertEquals(str("hello"), nextRecord.get("mystring"));
assertEquals(expectedEnumSymbol, nextRecord.get("myenum"));
assertEquals(nestedRecord, nextRecord.get("mynestedrecord"));
Expand Down Expand Up @@ -567,7 +567,7 @@ public void write(Map<String, Object> record) {
record.put("mylong", 2L);
record.put("myfloat", 3.1f);
record.put("mydouble", 4.1);
record.put("mybytes", ByteBuffer.wrap("hello".getBytes(Charsets.UTF_8)));
record.put("mybytes", ByteBuffer.wrap("hello".getBytes(StandardCharsets.UTF_8)));
record.put("mystring", "hello");
record.put("myenum", "a");
record.put("mynestedint", 1);
Expand Down Expand Up @@ -615,7 +615,7 @@ public void write(Map<String, Object> record) {
assertEquals(2L, nextRecord.get("mylong"));
assertEquals(3.1f, nextRecord.get("myfloat"));
assertEquals(4.1, nextRecord.get("mydouble"));
assertEquals(ByteBuffer.wrap("hello".getBytes(Charsets.UTF_8)), nextRecord.get("mybytes"));
assertEquals(ByteBuffer.wrap("hello".getBytes(StandardCharsets.UTF_8)), nextRecord.get("mybytes"));
assertEquals(str("hello"), nextRecord.get("mystring"));
assertEquals(str("a"), nextRecord.get("myenum")); // enum symbols are unknown
assertEquals(nestedRecord, nextRecord.get("mynestedrecord"));
Expand Down