Closed
24 commits
540d6da
Test on jenkins
wangyum Aug 14, 2019
6821aa5
Temporary way for testing JDK 11 on jenkins
wangyum Aug 14, 2019
e6508c0
Try to manually copy environment variables from the parent in run-tes…
HyukjinKwon Aug 14, 2019
2ebdcb9
Revert "Temporary way for testing JDK 11 on jenkins" (#23)
HyukjinKwon Aug 14, 2019
0085ad7
Try shell=True with explicit environment variables (#24)
HyukjinKwon Aug 14, 2019
1c1143f
Revert "Try shell=True with explicit environment variables (#24)"
HyukjinKwon Aug 14, 2019
8435110
Revert " Revert "Temporary way for testing JDK 11 on jenkins" (#23)"
HyukjinKwon Aug 14, 2019
77a70ae
Revert "Try to manually copy environment variables from the parent in…
HyukjinKwon Aug 14, 2019
6fdf309
Check if we need to set PATH for JDK11
wangyum Aug 14, 2019
b984414
Add JAVA_HOME into PATH as well
HyukjinKwon Aug 14, 2019
593a154
fix
wangyum Aug 14, 2019
8b04e78
Revert "fix"
wangyum Aug 14, 2019
59554f9
set java.version to 11
wangyum Aug 14, 2019
17285a6
Update
wangyum Aug 14, 2019
9254dfb
Fix: unresolved dependency: org.apache.hive#hive-metastore;2.3.6: not…
wangyum Aug 15, 2019
0ac0b30
Revert java.version to 1.8
wangyum Aug 15, 2019
500b4a7
Merge remote-tracking branch 'upstream/master' into test-on-jenkins
wangyum Aug 16, 2019
0caa93f
Update deps
wangyum Aug 16, 2019
24bb028
Update deps2
wangyum Aug 16, 2019
46322df
Test SPARK-28765 on JDK 11
wangyum Aug 17, 2019
3856828
Merge remote-tracking branch 'upstream/master' into test-on-jenkins
wangyum Aug 18, 2019
9defec2
Test Hive 2.3.6 on JDK 8
wangyum Aug 18, 2019
a9cbd7f
Merge remote-tracking branch 'upstream/master' into test-on-jenkins
wangyum Aug 22, 2019
ff4783c
Hive 2.3.6 vote passes
wangyum Aug 22, 2019
4 changes: 2 additions & 2 deletions docs/building-spark.md
@@ -83,12 +83,12 @@ Example:

To enable Hive integration for Spark SQL along with its JDBC server and CLI,
add the `-Phive` and `-Phive-thriftserver` profiles to your existing build options.
-By default, Spark will use Hive 1.2.1 with the `hadoop-2.7` profile, and Hive 2.3.5 with the `hadoop-3.2` profile.
+By default, Spark will use Hive 1.2.1 with the `hadoop-2.7` profile, and Hive 2.3.6 with the `hadoop-3.2` profile.

# With Hive 1.2.1 support
./build/mvn -Pyarn -Phive -Phive-thriftserver -DskipTests clean package

-# With Hive 2.3.5 support
+# With Hive 2.3.6 support
./build/mvn -Pyarn -Phive -Phive-thriftserver -Phadoop-3.2 -DskipTests clean package

## Packaging without Hadoop Dependencies for YARN
2 changes: 1 addition & 1 deletion docs/sql-data-sources-hive-tables.md
@@ -130,7 +130,7 @@ The following options can be used to configure the version of Hive that is used
<td><code>1.2.1</code></td>
<td>
Version of the Hive metastore. Available
-    options are <code>0.12.0</code> through <code>2.3.5</code> and <code>3.0.0</code> through <code>3.1.1</code>.
+    options are <code>0.12.0</code> through <code>2.3.6</code> and <code>3.0.0</code> through <code>3.1.1</code>.
</td>
</tr>
<tr>
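To make the documented option concrete, the metastore version can be pinned in `spark-defaults.conf`. This is an illustrative fragment, not part of the PR; `maven` is one of the documented resolution modes for `spark.sql.hive.metastore.jars`:

```
# Illustrative spark-defaults.conf fragment: pin the Hive metastore client
# version and let Spark resolve the matching jars from Maven.
spark.sql.hive.metastore.version   2.3.6
spark.sql.hive.metastore.jars      maven
```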
2 changes: 1 addition & 1 deletion docs/sql-migration-guide-hive-compatibility.md
@@ -25,7 +25,7 @@ license: |
Spark SQL is designed to be compatible with the Hive Metastore, SerDes and UDFs.
Currently, Hive SerDes and UDFs are based on Hive 1.2.1,
and Spark SQL can be connected to different versions of Hive Metastore
-(from 0.12.0 to 2.3.5 and 3.0.0 to 3.1.1. Also see [Interacting with Different Versions of Hive Metastore](sql-data-sources-hive-tables.html#interacting-with-different-versions-of-hive-metastore)).
+(from 0.12.0 to 2.3.6 and 3.0.0 to 3.1.1. Also see [Interacting with Different Versions of Hive Metastore](sql-data-sources-hive-tables.html#interacting-with-different-versions-of-hive-metastore)).

#### Deploying in Existing Hive Warehouses
2 changes: 1 addition & 1 deletion pom.xml
@@ -132,7 +132,7 @@
<hive.classifier></hive.classifier>
<!-- Version used in Maven Hive dependency -->
<hive.version>1.2.1.spark2</hive.version>
-    <hive23.version>2.3.5</hive23.version>
+    <hive23.version>2.3.6</hive23.version>
<!-- Version used for internal directory structure -->
<hive.version.short>1.2.1</hive.version.short>
<!-- note that this should be compatible with Kafka brokers version 0.10 and up -->
@@ -32,7 +32,6 @@ import org.apache.hadoop.hive.cli.{CliDriver, CliSessionState, OptionsProcessor}
import org.apache.hadoop.hive.common.HiveInterruptUtils
import org.apache.hadoop.hive.conf.HiveConf
import org.apache.hadoop.hive.ql.Driver
-import org.apache.hadoop.hive.ql.exec.Utilities
import org.apache.hadoop.hive.ql.processors._
import org.apache.hadoop.hive.ql.session.SessionState
import org.apache.hadoop.security.{Credentials, UserGroupInformation}
@@ -143,7 +142,7 @@ private[hive] object SparkSQLCLIDriver extends Logging {
var loader = conf.getClassLoader
val auxJars = HiveConf.getVar(conf, HiveConf.ConfVars.HIVEAUXJARS)
if (StringUtils.isNotBlank(auxJars)) {
-      loader = Utilities.addToClassPath(loader, StringUtils.split(auxJars, ","))
+      loader = ThriftserverShimUtils.addToClassPath(loader, StringUtils.split(auxJars, ","))
}
conf.setClassLoader(loader)
Thread.currentThread().setContextClassLoader(loader)
@@ -537,7 +537,7 @@ class HiveThriftBinaryServerSuite extends HiveThriftJdbcTest {
}

if (HiveUtils.isHive23) {
-      assert(conf.get(HiveUtils.FAKE_HIVE_VERSION.key) === Some("2.3.5"))
+      assert(conf.get(HiveUtils.FAKE_HIVE_VERSION.key) === Some("2.3.6"))
} else {
assert(conf.get(HiveUtils.FAKE_HIVE_VERSION.key) === Some("1.2.1"))
}
@@ -554,7 +554,7 @@ class HiveThriftBinaryServerSuite extends HiveThriftJdbcTest {
}

if (HiveUtils.isHive23) {
-      assert(conf.get(HiveUtils.FAKE_HIVE_VERSION.key) === Some("2.3.5"))
+      assert(conf.get(HiveUtils.FAKE_HIVE_VERSION.key) === Some("2.3.6"))
} else {
assert(conf.get(HiveUtils.FAKE_HIVE_VERSION.key) === Some("1.2.1"))
}
@@ -18,6 +18,7 @@
package org.apache.spark.sql.hive.thriftserver

import org.apache.commons.logging.LogFactory
+import org.apache.hadoop.hive.ql.exec.Utilities
import org.apache.hadoop.hive.ql.session.SessionState
import org.apache.hive.service.cli.{RowSet, RowSetFactory, TableSchema, Type}
import org.apache.hive.service.cli.thrift.TProtocolVersion._
@@ -50,6 +51,12 @@ private[thriftserver] object ThriftserverShimUtils {

private[thriftserver] def toJavaSQLType(s: String): Int = Type.getType(s).toJavaSQLType

+  private[thriftserver] def addToClassPath(
+      loader: ClassLoader,
+      auxJars: Array[String]): ClassLoader = {
+    Utilities.addToClassPath(loader, auxJars)
+  }
+
private[thriftserver] val testedProtocolVersions = Seq(
HIVE_CLI_SERVICE_PROTOCOL_V1,
HIVE_CLI_SERVICE_PROTOCOL_V2,
@@ -17,6 +17,11 @@
package org.apache.spark.sql.hive.thriftserver
Review comment from dongjoon-hyun (Member), Aug 24, 2019:

During JDK 11 testing and review, we skipped renaming in order to keep the PR diff minimal and focused on the JDK 11-related changes. We may need to rename this source directory from v2.3.5 to v2.3.6 later for consistency. If the tests pass, I'd like to merge this PR as-is first.

cc @gatorsmile, @srowen
+import java.security.AccessController
+
+import scala.collection.JavaConverters._
+
+import org.apache.hadoop.hive.ql.exec.AddToClassPathAction
import org.apache.hadoop.hive.ql.session.SessionState
import org.apache.hadoop.hive.serde2.thrift.Type
import org.apache.hive.service.cli.{RowSet, RowSetFactory, TableSchema}
@@ -51,6 +56,13 @@ private[thriftserver] object ThriftserverShimUtils {

private[thriftserver] def toJavaSQLType(s: String): Int = Type.getType(s).toJavaSQLType

+  private[thriftserver] def addToClassPath(
+      loader: ClassLoader,
+      auxJars: Array[String]): ClassLoader = {
+    val addAction = new AddToClassPathAction(loader, auxJars.toList.asJava)
+    AccessController.doPrivileged(addAction)
+  }
+
private[thriftserver] val testedProtocolVersions = Seq(
HIVE_CLI_SERVICE_PROTOCOL_V1,
HIVE_CLI_SERVICE_PROTOCOL_V2,
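The two shim objects give `SparkSQLCLIDriver` one stable `addToClassPath` signature while each Hive version supplies its own implementation (Hive 1.2's `Utilities.addToClassPath` vs. Hive 2.3.6's `AddToClassPathAction`). The pattern can be sketched in plain Scala; the names and bodies below are hypothetical stand-ins, not Spark's actual build wiring:

```scala
// Hypothetical sketch of the per-version shim pattern: two objects with an
// identical signature, of which the build includes exactly one per profile.
trait ClassPathShim {
  def addToClassPath(loader: ClassLoader, auxJars: Array[String]): ClassLoader
}

// Stand-in for the Hive 1.2 shim (the real one delegates to Utilities).
object Hive12Shim extends ClassPathShim {
  def addToClassPath(loader: ClassLoader, auxJars: Array[String]): ClassLoader =
    loader // placeholder: real code mutates/wraps the loader via Hive 1.2 APIs
}

// Stand-in for the Hive 2.3 shim (the real one uses AddToClassPathAction
// under AccessController.doPrivileged). Here we just wrap the parent loader.
object Hive23Shim extends ClassPathShim {
  def addToClassPath(loader: ClassLoader, auxJars: Array[String]): ClassLoader =
    new java.net.URLClassLoader(
      auxJars.map(j => new java.io.File(j).toURI.toURL), loader)
}
```

Because callers only depend on the trait-shaped signature, the driver code compiles unchanged against either shim.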
@@ -63,7 +63,7 @@ private[spark] object HiveUtils extends Logging {

val HIVE_METASTORE_VERSION = buildConf("spark.sql.hive.metastore.version")
.doc("Version of the Hive metastore. Available options are " +
-      "<code>0.12.0</code> through <code>2.3.5</code> and " +
+      "<code>0.12.0</code> through <code>2.3.6</code> and " +
"<code>3.0.0</code> through <code>3.1.1</code>.")
.stringConf
.createWithDefault(builtinHiveVersion)
@@ -101,7 +101,7 @@ private[hive] object IsolatedClientLoader extends Logging {
case "2.0" | "2.0.0" | "2.0.1" => hive.v2_0
case "2.1" | "2.1.0" | "2.1.1" => hive.v2_1
case "2.2" | "2.2.0" => hive.v2_2
-    case "2.3" | "2.3.0" | "2.3.1" | "2.3.2" | "2.3.3" | "2.3.4" | "2.3.5" => hive.v2_3
+    case "2.3" | "2.3.0" | "2.3.1" | "2.3.2" | "2.3.3" | "2.3.4" | "2.3.5" | "2.3.6" => hive.v2_3
case "3.0" | "3.0.0" => hive.v3_0
case "3.1" | "3.1.0" | "3.1.1" => hive.v3_1
case version =>
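The change above simply adds "2.3.6" to the set of version strings that resolve to the `hive.v2_3` client. The mapping can be sketched in isolation as a self-contained stand-in (hypothetical object name, not Spark's `IsolatedClientLoader.hiveVersion`):

```scala
// Simplified sketch of the version-string mapping above: every 2.3.x patch
// release, now including 2.3.6, resolves to the same client shim version.
object HiveVersionMapping {
  def shortVersion(version: String): Option[String] = version match {
    case "2.3" | "2.3.0" | "2.3.1" | "2.3.2" | "2.3.3" | "2.3.4" | "2.3.5" | "2.3.6" =>
      Some("2.3")
    case "3.0" | "3.0.0" => Some("3.0")
    case "3.1" | "3.1.0" | "3.1.1" => Some("3.1")
    case _ => None // the real code raises an error for unsupported versions
  }
}
```

This is why the upgrade needs no new client shim: 2.3.6 is metastore-compatible with the existing `v2_3` client, so only the string match and the default artifact version change.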
@@ -89,7 +89,7 @@ package object client {

// Since HIVE-14496, Hive materialized view need calcite-core.
// For spark, only VersionsSuite currently creates a hive materialized view for testing.
-  case object v2_3 extends HiveVersion("2.3.5",
+  case object v2_3 extends HiveVersion("2.3.6",
exclusions = Seq("org.apache.calcite:calcite-druid",
"org.apache.calcite.avatica:avatica",
"org.apache.curator:*",