
Conversation


@Fokko Fokko commented Nov 3, 2019

It looks like there are issues with older versions of Hive that use DataNucleus 3.x:

java.lang.RuntimeException: Cannot start TestHiveMetastore
	at org.apache.iceberg.hive.TestHiveMetastore.start(TestHiveMetastore.java:69)
	at org.apache.iceberg.hive.HiveMetastoreTest.startMetastore(HiveMetastoreTest.java:41)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:32)
	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:93)
	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:118)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:175)
	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:157)
	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:404)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
	at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: javax.jdo.JDOFatalInternalException: The java type java.lang.Long (jdbc-type="", sql-type="") cant be mapped for this datastore. No mapping is available.
NestedThrowables:
org.datanucleus.exceptions.NucleusException: The java type java.lang.Long (jdbc-type="", sql-type="") cant be mapped for this datastore. No mapping is available.
	at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:591)
	at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:732)
	at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:752)
	at org.apache.hadoop.hive.metastore.ObjectStore.setMetaStoreSchemaVersion(ObjectStore.java:6773)
	at org.apache.hadoop.hive.metastore.ObjectStore.checkSchema(ObjectStore.java:6670)
	at org.apache.hadoop.hive.metastore.ObjectStore.verifySchema(ObjectStore.java:6645)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:114)
	at com.sun.proxy.$Proxy16.verifySchema(Unknown Source)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:572)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:419)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:412)
	at org.apache.iceberg.hive.TestHiveMetastore.newThriftServer(TestHiveMetastore.java:97)
	at org.apache.iceberg.hive.TestHiveMetastore.start(TestHiveMetastore.java:65)
	... 41 more
Caused by: org.datanucleus.exceptions.NucleusException: The java type java.lang.Long (jdbc-type="", sql-type="") cant be mapped for this datastore. No mapping is available.
	at org.datanucleus.store.rdbms.mapping.RDBMSMappingManager.getDatastoreMappingClass(RDBMSMappingManager.java:1215)
	at org.datanucleus.store.rdbms.mapping.RDBMSMappingManager.createDatastoreMapping(RDBMSMappingManager.java:1378)
	at org.datanucleus.store.rdbms.table.AbstractClassTable.addDatastoreId(AbstractClassTable.java:392)
	at org.datanucleus.store.rdbms.table.ClassTable.initializePK(ClassTable.java:1087)
	at org.datanucleus.store.rdbms.table.ClassTable.preInitialize(ClassTable.java:247)
	at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.addClassTable(RDBMSStoreManager.java:3118)
	at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.addClassTables(RDBMSStoreManager.java:2909)
	at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.addClassTablesAndValidate(RDBMSStoreManager.java:3182)
	at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2841)
	at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:122)
	at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:1605)
	at org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:954)
	at org.datanucleus.store.rdbms.RDBMSStoreManager.getDatastoreClass(RDBMSStoreManager.java:679)
	at org.datanucleus.store.rdbms.RDBMSStoreManager.getPropertiesForGenerator(RDBMSStoreManager.java:2045)
	at org.datanucleus.store.AbstractStoreManager.getStrategyValue(AbstractStoreManager.java:1365)
	at org.datanucleus.ExecutionContextImpl.newObjectId(ExecutionContextImpl.java:3827)
	at org.datanucleus.state.JDOStateManager.setIdentity(JDOStateManager.java:2571)
	at org.datanucleus.state.JDOStateManager.initialiseForPersistentNew(JDOStateManager.java:513)
	at org.datanucleus.state.ObjectProviderFactoryImpl.newForPersistentNew(ObjectProviderFactoryImpl.java:232)
	at org.datanucleus.ExecutionContextImpl.newObjectProviderForPersistentNew(ExecutionContextImpl.java:1414)
	at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2218)
	at org.datanucleus.ExecutionContextImpl.persistObjectWork(ExecutionContextImpl.java:2065)
	at org.datanucleus.ExecutionContextImpl.persistObject(ExecutionContextImpl.java:1913)
	at org.datanucleus.ExecutionContextThreadedImpl.persistObject(ExecutionContextThreadedImpl.java:217)
	at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:727)
	... 58 more

We're using these versions in our test dependencies: https://github.com/apache/incubator-iceberg/blob/master/versions.props#L19-L20
It looks like Spark has disabled these tests:
cenyuhai/spark@58cc0df

Also, the Hive that Spark 2.x depends on is still on DataNucleus 3.x: https://mvnrepository.com/artifact/org.apache.spark/spark-hive_2.12/2.4.4
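
For reference, versions.props pins one group:artifact = version per line; an illustrative pin of the Hive test dependencies might look like this (placeholder versions, not the file's actual contents):

# illustrative only -- not the real lines from versions.props
org.apache.hive:hive-metastore = 1.2.1
org.apache.hive:hive-exec = 1.2.1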


Fokko commented Nov 3, 2019

For local testing I was thinking of building a Docker container that spins up a Hive metastore. That would also let you actually create tables locally. We could use the same Docker container for running the Hive tests, so we wouldn't have to run an in-process Hive metastore, and that would solve this problem as well. WDYT @rdblue?
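
A minimal docker-compose sketch of what I have in mind (the hive-metastore image here is a hypothetical, locally built one, and the settings are placeholders):

# docker-compose.yml -- sketch only, assuming a locally built metastore image
version: "3"
services:
  metastore-db:
    image: postgres:11              # backing RDBMS instead of the embedded Derby
    environment:
      POSTGRES_USER: hive
      POSTGRES_PASSWORD: hive
      POSTGRES_DB: metastore
  hive-metastore:
    image: hive-metastore:local     # hypothetical image, built from the Hive binaries
    depends_on:
      - metastore-db
    ports:
      - "9083:9083"                 # standard metastore Thrift port

The tests (and local Spark) would then point hive.metastore.uris at thrift://localhost:9083 instead of starting an in-process metastore.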

@Fokko Fokko closed this Nov 3, 2019
@Fokko Fokko deleted the java-8-check branch November 3, 2019 19:44

Fokko commented Nov 3, 2019

Closed this one in favor of #577

szehon-ho pushed a commit to szehon-ho/iceberg that referenced this pull request Aug 3, 2022
…tion (apache#602)

AWS Kms based Kms Client Implementation for envelope encryption
haizhou-zhao pushed a commit to haizhou-zhao/iceberg that referenced this pull request Oct 31, 2022
…tion (apache#602)

AWS Kms based Kms Client Implementation for envelope encryption