2017-10-09 1284 views

I am trying to stream data from Kafka using Flink. My code compiles without errors, but at runtime it fails with the following error: NoClassDefFoundError when running Flink with the Kafka connector.

Error: A JNI error has occurred, please check your installation and try again 
Exception in thread "main" java.lang.NoClassDefFoundError: 
    org/apache/flink/streaming/util/serialization/DeserializationSchema 
    at java.lang.Class.getDeclaredMethods0(Native Method) 
    at java.lang.Class.privateGetDeclaredMethods(Class.java:2701) 
    at java.lang.Class.privateGetMethodRecursive(Class.java:3048) 
    at java.lang.Class.getMethod0(Class.java:3018) 
    at java.lang.Class.getMethod(Class.java:1784) 
    at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544) 
    at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526) 
Caused by: java.lang.ClassNotFoundException: org.apache.flink.streaming.util.serialization.DeserializationSchema 
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424) 
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357) 
    ... 7 more 

My POM dependency list is as follows:

<dependencies> 
     <dependency> 
      <groupId>org.apache.flink</groupId> 
      <artifactId>flink-java</artifactId> 
      <version>1.3.2</version> 
     </dependency> 
     <dependency> 
      <groupId>org.apache.flink</groupId> 
      <artifactId>flink-streaming-core</artifactId> 
      <version>0.9.1</version> 
     </dependency> 
     <dependency> 
      <groupId>org.apache.flink</groupId> 
      <artifactId>flink-clients</artifactId> 
      <version>0.10.2</version> 
     </dependency> 
     <dependency> 
      <groupId>org.apache.flink</groupId> 
      <artifactId>flink-connector-kafka-0.9_2.11</artifactId> 
      <version>1.3.2</version> 
     </dependency> 
     <dependency> 
      <groupId>com.googlecode.json-simple</groupId> 
      <artifactId>json-simple</artifactId> 
      <version>1.1</version> 
     </dependency> 
    </dependencies> 

I just want to run Java code that subscribes to a Kafka topic called 'streamer':

import java.util.Properties; 
import java.util.Arrays; 
import org.apache.flink.api.common.functions.MapFunction; 
import org.apache.flink.api.java.utils.ParameterTool; 
import org.apache.flink.streaming.api.datastream.DataStream; 
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment; 
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09; 
import org.apache.flink.streaming.util.serialization.SimpleStringSchema; 
import org.apache.flink.streaming.util.serialization.DeserializationSchema; 

public class StreamConsumer {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        Properties properties = new Properties();
        properties.setProperty("bootstrap.servers", "localhost:9092");
        properties.setProperty("group.id", "samplegroup");

        DataStream<String> messageStream = env.addSource(
                new FlinkKafkaConsumer09<String>("streamer", new SimpleStringSchema(), properties));

        messageStream.rebalance().map(new MapFunction<String, String>() {
            private static final long serialVersionUID = -6867736771747690202L;

            @Override
            public String map(String value) throws Exception {
                return "Streamed data: " + value;
            }
        }).print();
        env.execute();
    }
}

System information:
1. Kafka version: 0.9.0.1
2. Flink version: 1.3.2
3. OpenJDK version: 1.8

Although I use Maven, I do not think this is a Maven problem, because I got the same error even when I tried without Maven. I manually downloaded all the required .jar files into one folder and passed that folder path with the -cp option when compiling with javac. At runtime I get the same error as above, but there are no errors during compilation.
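A quick way to confirm whether the class the JVM complains about is really absent from the runtime classpath is a small diagnostic like the one below. `ClasspathCheck` is a hypothetical helper written for this answer, not part of the original project; run it with the same -cp you use for StreamConsumer. If it prints "NOT on the runtime classpath" for DeserializationSchema, the jar containing that class (flink-streaming-java) is missing from your folder.

```java
import java.security.CodeSource;

// Hypothetical diagnostic: report whether each class can be loaded
// and, if so, which jar (CodeSource) it was loaded from.
public class ClasspathCheck {
    public static void main(String[] args) {
        String[] names = {
            "java.util.Properties",  // JDK class, should always resolve
            "org.apache.flink.streaming.util.serialization.DeserializationSchema"  // the class from the stack trace
        };
        for (String name : names) {
            try {
                Class<?> c = Class.forName(name);
                CodeSource src = c.getProtectionDomain().getCodeSource();
                // Classes loaded by the bootstrap loader report a null CodeSource
                System.out.println(name + " -> " + (src == null ? "bootstrap/JDK" : src.getLocation()));
            } catch (ClassNotFoundException e) {
                System.out.println(name + " -> NOT on the runtime classpath");
            }
        }
    }
}
```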

Answers


The first problem I see in your POM is that you are using different versions for the Flink imports. Try using version 1.3.2 for all Flink modules. This error frequently happens when you mix incompatible or multiple versions of a library.

Try using the following dependencies (assuming you are using Scala 2.11):

<dependencies> 
    <dependency> 
     <groupId>org.apache.flink</groupId> 
     <artifactId>flink-java</artifactId> 
     <version>1.3.2</version> 
    </dependency> 
    <dependency> 
     <groupId>org.apache.flink</groupId> 
     <artifactId>flink-streaming-java_2.11</artifactId> 
     <version>1.3.2</version> 
    </dependency> 
    <dependency> 
     <groupId>org.apache.flink</groupId> 
     <artifactId>flink-clients_2.11</artifactId> 
     <version>1.3.2</version> 
    </dependency> 
    <dependency> 
     <groupId>org.apache.flink</groupId> 
     <artifactId>flink-connector-kafka-0.9_2.11</artifactId> 
     <version>1.3.2</version> 
    </dependency> 
    <dependency> 
     <groupId>com.googlecode.json-simple</groupId> 
     <artifactId>json-simple</artifactId> 
     <version>1.1</version> 
    </dependency> 
</dependencies> 

If you still have the same problem afterwards, please post your sample code so that I can reproduce the error.


I thought of limiting the scope of the class files, but the versions mentioned here are the latest. – raviabhiram


flink-streaming-core no longer exists and is subsumed by flink-streaming-java_. flink-clients now also has a Scala suffix. –


I tried using these dependencies but still get the same error. I have added my Java code above. – raviabhiram


NoClassDefFoundError when running flink with kafka connector

Your code compiles, yet you get a NoClassDefFoundError at runtime. I think one of the libraries you depend on was present at compile time but is missing from the runtime classpath. Maven automatically downloads the dependencies declared in the .pom for compilation, but they also have to be available when you actually run the program.

That is probably what is causing your NoClassDefFoundError.

Solution: clean and build
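One common reason `mvn clean package` still yields a jar that fails with NoClassDefFoundError is that the default jar contains only your own classes; the Flink and Kafka dependencies must also be on the classpath at launch. A minimal sketch using the maven-shade-plugin to bundle everything into one runnable jar follows (the plugin version and the main class name are assumptions based on the code posted above, not part of the original answer):

```xml
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>3.1.0</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals><goal>shade</goal></goals>
          <configuration>
            <transformers>
              <!-- Make the shaded jar runnable with `java -jar` -->
              <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                <mainClass>StreamConsumer</mainClass>
              </transformer>
            </transformers>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```

With this in place, `mvn clean package` produces a self-contained jar that can be started without manually assembling a -cp folder.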


I build with the command 'mvn clean package'. – raviabhiram
