I am trying to build the HBase Spark JAR (for EMR version 7.1.0) using the following command:

mvn --projects hbase-spark --also-make -Dspark.version=3.5.0 -Dscala.version=2.12.17 -Dhadoop-three.version=3.3.6 -Dscala.binary.version=2.12 -Dhbase.version=2.4.17 clean package
Two of the tests (org.apache.hadoop.hbase.spark.TestJavaHBaseContext and org.apache.hadoop.hbase.spark.TestJavaHBaseContextForLargeRows) fail with the same exception:
org.apache.hadoop.hbase.spark.TestJavaHBaseContextForLargeRows Time elapsed: 36.854 s <<< ERROR!
java.io.IOException: Shutting down
at org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:256)
at org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:109)
at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniHBaseCluster(HBaseTestingUtility.java:1131)
at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:1094)
at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:1058)
at org.apache.hadoop.hbase.spark.TestJavaHBaseContext.init(TestJavaHBaseContext.java:107)
at org.apache.hadoop.hbase.spark.TestJavaHBaseContextForLargeRows.setUpBeforeClass(TestJavaHBaseContextForLargeRows.java:51)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
at org.junit.internal.runners.statements.RunBefores.invokeMethod(RunBefores.java:33)
at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
at org.apache.hadoop.hbase.SystemExitRule$1.evaluate(SystemExitRule.java:39)
at org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:299)
at org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:293)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: Master not active after 30000ms
at org.apache.hadoop.hbase.util.JVMClusterUtil.waitForEvent(JVMClusterUtil.java:221)
at org.apache.hadoop.hbase.util.JVMClusterUtil.startup(JVMClusterUtil.java:177)
at org.apache.hadoop.hbase.LocalHBaseCluster.startup(LocalHBaseCluster.java:407)
at org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:249)
... 21 more
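One environmental cause often behind a "Master not active" timeout in HBaseTestingUtility mini-cluster startups is local hostname resolution (the master binds to and waits on an address derived from the machine's hostname). This is an assumption about the failure, not something confirmed by the trace above; a minimal, hypothetical check of whether the build host's name resolves:

```java
import java.net.InetAddress;

// Hypothetical diagnostic: verify that the local hostname resolves to an
// address. HBase mini-cluster tests can hang at master startup when it does
// not (e.g., a missing /etc/hosts entry on the build machine).
public class HostnameProbe {
    public static void main(String[] args) throws Exception {
        InetAddress local = InetAddress.getLocalHost();
        // Prints something like "build-host -> 10.0.0.5"
        System.out.println(local.getHostName() + " -> " + local.getHostAddress());
    }
}
```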
I bypassed the test failures by skipping them with -DskipTests=true, after which the build succeeded. However, I am unable to use the resulting JAR to connect to HBase from Spark. Table creation via Spark fails with the following error:

java.lang.NoClassDefFoundError: com/google/protobuf/RpcChannel
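A NoClassDefFoundError for com/google/protobuf/RpcChannel usually means the non-relocated protobuf-java 2.5.x artifact (which HBase 2.4.x still references under the com.google.protobuf coordinates, alongside its relocated org.apache.hbase.thirdparty copy) is absent from the Spark runtime classpath; newer EMR stacks may no longer ship it. That diagnosis is an assumption here, so one way to confirm it is a small probe run with the exact --jars/classpath you pass to Spark. ProtobufProbe is a hypothetical diagnostic class, not part of the connector:

```java
// Hypothetical diagnostic: check which protobuf RpcChannel classes are visible
// to the current classloader. Run with the same classpath you give Spark.
public class ProtobufProbe {
    static boolean check(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // The non-relocated name Spark is failing on (from protobuf-java 2.5.x):
        System.out.println("com.google.protobuf.RpcChannel present: "
                + check("com.google.protobuf.RpcChannel"));
        // The relocated copy HBase 2.x uses internally via hbase-thirdparty:
        System.out.println("org.apache.hbase.thirdparty.com.google.protobuf.RpcChannel present: "
                + check("org.apache.hbase.thirdparty.com.google.protobuf.RpcChannel"));
    }
}
```

If the first probe reports false at runtime, supplying protobuf-java 2.5.x alongside the connector JAR (via --jars or the driver/executor extraClassPath settings) is the kind of fix to try.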
Any advice would be very helpful.
P.S. Everything works seamlessly (both the tests and the connection from Spark to HBase) with an older EMR version (6.7.0), which uses older versions of HBase and Spark:

mvn --projects hbase-spark --also-make -Dspark.version=3.1.2 -Dscala.version=2.12.10 -Dhadoop-three.version=3.2.1 -Dscala.binary.version=2.12 -Dhbase.version=2.4.4 clean package
I am currently working on upgrading our EMR versions to take advantage of newer, more efficient instance types, which requires upgrading the underlying HBase connector JAR.