Spark Kafka streaming error - "java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka/KafkaUtils"

Keerthi

I am writing a simple Kafka - Spark Streaming program in Eclipse to consume messages from a Kafka broker using Spark Streaming. The code is below, and when I try to run it from Eclipse I get the error shown further down.

I have also made sure the dependency jars are in place. Please help me get rid of this error.

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object spark_kafka_streaming {

  def main(args: Array[String]) {

    val conf = new SparkConf()
      .setAppName("The swankiest Spark app ever")
      .setMaster("local[*]")

    // 60-second batches; the checkpoint directory is needed for stateful operations
    val ssc = new StreamingContext(conf, Seconds(60))
    ssc.checkpoint("C:\\keerthi\\software\\eclipse-jee-mars-2-win32-x86_64\\eclipse")

    println("Parameters:" + "zkorum:" + "group:" + "topicMap:" + "number of threads:")

    val zk = "xxxxxxxx:2181"
    val group = "test-consumer-group"
    val topics = "my-replicated-topic"
    val numThreads = 2

    val topicMap = topics.split(",").map((_, numThreads.toInt)).toMap

    // Receiver-based Kafka stream; KafkaUtils comes from spark-streaming-kafka_2.10
    val lines = KafkaUtils.createStream(ssc, zk, group, topicMap).map(_._2)
    val words = lines.flatMap(_.split(" "))
    val wordCounts = words.map(x => (x, 1L)).count()

    println("wordCounts:" + wordCounts)

    //wordCounts.print

    // Start the streaming job and keep it running
    ssc.start()
    ssc.awaitTermination()
  }
}

Exception:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka/KafkaUtils$
    at org.firststream.spark_kakfa.spark_kafka_streaming$.main(spark_kafka_streaming.scala:30)
    at org.firststream.spark_kakfa.spark_kafka_streaming.main(spark_kafka_streaming.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.kafka.KafkaUtils$
    at java.net.URLClassLoader.findClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    ... 2 more

Dependencies:

  <dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.10</artifactId>
    <version>0.8.1.1</version>
    <scope>compile</scope>
    <exclusions>
      <exclusion>
        <artifactId>jmxri</artifactId>
        <groupId>com.sun.jmx</groupId>
      </exclusion>
      <exclusion>
        <artifactId>jms</artifactId>
        <groupId>javax.jms</groupId>
      </exclusion>
      <exclusion>
        <artifactId>jmxtools</artifactId>
        <groupId>com.sun.jdmk</groupId>
      </exclusion>
    </exclusions>
  </dependency>

  <dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.8.2.0</version>
  </dependency>

  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.10</artifactId>
    <version>1.2.0</version>
  </dependency>

  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>1.2.0</version>
  </dependency>
Keerthi

I commented out the dependencies below and instead added the spark-streaming-kafka_2.10 and kafka_2.10-0.8.1.1 jars directly to the Referenced Libraries in Eclipse via Build Path -> Configure Build Path -> Add External JARs. That resolved the issue.

<!-- dependency>
  <groupId>org.apache.kafka</groupId>
  <artifactId>kafka_2.10</artifactId>
  <version>0.8.1.1</version>
  <scope>compile</scope>
  <exclusions>
    <exclusion>
      <artifactId>jmxri</artifactId>
      <groupId>com.sun.jmx</groupId>
    </exclusion>
    <exclusion>
      <artifactId>jms</artifactId>
      <groupId>javax.jms</groupId>
    </exclusion>
    <exclusion>
      <artifactId>jmxtools</artifactId>
      <groupId>com.sun.jdmk</groupId>
    </exclusion>
  </exclusions>
 </dependency> -->

 <!--<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-clients</artifactId>
<version>0.8.2.0</version>
</dependency>-->

<!-- <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.10</artifactId>
    <version>1.2.0</version>
</dependency>-->
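A related note: if the job is later packaged and submitted outside Eclipse, the same NoClassDefFoundError commonly reappears unless the Kafka connector classes are bundled into the application jar (or supplied separately on the classpath). Below is only a sketch of that idea using the maven-shade-plugin; the plugin version and configuration are illustrative assumptions, not part of the original fix:

<build>
  <plugins>
    <!-- Sketch: build a single "fat" jar that also contains
         spark-streaming-kafka_2.10 and its transitive Kafka classes,
         so KafkaUtils can be found at run time. -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>2.3</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>

With something like this, mvn package would produce a jar under target/ that already includes the Kafka streaming classes, rather than relying on manually added external jars in the IDE.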
