
Scala / sbt java.lang.ClassNotFoundException: com.ullink.slack.simpleslackapi.listeners.SlackMessagePostedListener #256

Open
CStrue opened this issue Sep 28, 2018 · 0 comments


CStrue commented Sep 28, 2018

Hi,

I want to include the Slack API in a Spark Streaming project built with Scala and sbt.
When I run the program I get a ClassNotFoundException; I guess there are dependency issues?

The error:

```
Exception in thread "main" java.lang.NoClassDefFoundError: com/ullink/slack/simpleslackapi/listeners/SlackMessagePostedListener
	at org.apache.spark.sql.streaming.DataStreamReader.load(DataStreamReader.scala:175)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.ullink.slack.simpleslackapi.listeners.SlackMessagePostedListener
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 17 more
```
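Since `NoClassDefFoundError` at runtime usually means the class compiled fine but is missing from the classpath `spark-submit` actually uses, one quick check is whether the class made it into the jar being submitted. A sketch (the jar path is an assumption based on sbt-assembly's default naming for this project name, version, and Scala 2.11):

```shell
# List the contents of the assembly jar and search for the missing class.
# If grep prints nothing, the class is not in the jar handed to spark-submit.
jar tf target/scala-2.11/slackKnowledgeCollector-assembly-0.1.jar \
  | grep 'simpleslackapi/listeners/SlackMessagePostedListener'
```

If the class is absent, the jar being submitted is likely the thin `sbt package` jar rather than the assembly jar.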

My build.sbt looks like this:

```scala
name := "slackKnowledgeCollector"
version := "0.1"
scalaVersion := "2.11.8"

val sparkVersion = "2.3.1"

resolvers ++= Seq(
  "Hortonworks" at "http://repo.hortonworks.com/content/repositories/releases/",
  "Hortonworks Groups" at "http://repo.hortonworks.com/content/groups/public/",
  "Apache Releases" at "https://repository.apache.org/content/repositories/releases/",
  "Maven Central" at "http://central.maven.org/maven2/"
)

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % Provided,
  "org.apache.spark" %% "spark-sql" % sparkVersion % Provided,
  "org.apache.spark" %% "spark-sql-kafka-0-10" % sparkVersion,
  //"org.apache.spark" %% "spark-streaming" % sparkVersion % Provided,
  //"org.apache.spark" %% "spark-streaming-kafka-0-10" % sparkVersion % Provided,
  "org.apache.kafka" %% "kafka" % "0.10.0.2.5.3.0-37",
  "com.ullink.slack" % "simpleslackapi" % "1.2.0" excludeAll(
    ExclusionRule(organization = "org.apache.httpcomponents"),
    ExclusionRule(organization = "com.google.guava"),
    ExclusionRule(organization = "ch.qos.logback"),
    ExclusionRule(organization = "org.slf4j")
  )
)

assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false, cacheOutput = false)
test in assembly := {}
assemblyMergeStrategy in assembly := {
  case PathList("org", "apache", "spark", "unused", "UnusedStubClass.class") => MergeStrategy.discard
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}

scalacOptions += "-target:jvm-1.8"
```
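For what it's worth: since `simpleslackapi` is not marked `Provided`, sbt-assembly should bundle it into the fat jar, but the thin jar produced by plain `sbt package` will not contain it. A sketch of building and submitting the assembly jar (the main class, master URL, and jar path are placeholders, not taken from the project):

```shell
# Build the fat jar with sbt-assembly; `sbt package` alone produces a jar
# containing only this project's own classes, without simpleslackapi.
sbt assembly

# Submit the assembly jar. --class and --master here are placeholders.
spark-submit \
  --class com.example.SlackKnowledgeCollector \
  --master "local[2]" \
  target/scala-2.11/slackKnowledgeCollector-assembly-0.1.jar
```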

Does anyone have an idea how to solve this?
