2017-07-03

Accessing Hive with JdbcIO in Apache Beam throws java.lang.NoClassDefFoundError: org/apache/avro/reflect/AvroSchema

I can access my MySQL tables using JdbcIO and AvroCoder. Now I am trying to load my Hive database with JdbcIO, but the following exception is thrown when connecting to Hive from Dataflow. Any help from Beam experts would be much appreciated.

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/avro/reflect/AvroSchema 
at org.apache.beam.sdk.coders.AvroCoder$AvroDeterminismChecker.recurse(AvroCoder.java:426) 
at org.apache.beam.sdk.coders.AvroCoder$AvroDeterminismChecker.check(AvroCoder.java:419) 
at org.apache.beam.sdk.coders.AvroCoder.<init>(AvroCoder.java:259) 
at org.apache.beam.sdk.coders.AvroCoder.of(AvroCoder.java:120) 
at com.google.cloud.bigquery.csv.loader.GoogleSQLPipeline.main(GoogleSQLPipeline.java:101) 
Caused by: java.lang.ClassNotFoundException: org.apache.avro.reflect.AvroSchema 
at java.net.URLClassLoader.findClass(URLClassLoader.java:381) 
at java.lang.ClassLoader.loadClass(ClassLoader.java:424) 
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335) 
at java.lang.ClassLoader.loadClass(ClassLoader.java:357) 
... 5 more 

The following code snippet tries to access Hive:

dataflowPipeline
    .apply(JdbcIO.<Customer>read()
        .withDataSourceConfiguration(JdbcIO.DataSourceConfiguration
            .create("org.apache.hive.jdbc.HiveDriver", "jdbc:hive2://<ip>/mydb")
            .withUsername("username").withPassword("password"))
        .withQuery(
            "select c_customer_id,c_first_name,c_last_name,c_preferred_cust_flag,c_birth_day,c_birth_month,c_birth_year,c_birth_country,c_customer_sk,c_current_cdemo_sk,c_current_hdemo_sk from customer")
        .withRowMapper(new JdbcIO.RowMapper<Customer>() {
            @Override
            public Customer mapRow(ResultSet resultSet) throws Exception {
                // ... populate and return a Customer from the ResultSet
            }
        }));

POM dependencies:


<dependencies> 
    <dependency> 
     <groupId>com.google.cloud.dataflow</groupId> 
     <artifactId>google-cloud-dataflow-java-sdk-all</artifactId> 
     <version>2.0.0</version> 
    </dependency> 
    <!-- https://mvnrepository.com/artifact/org.apache.beam/beam-sdks-java-io-jdbc --> 
    <dependency> 
     <groupId>org.apache.beam</groupId> 
     <artifactId>beam-sdks-java-io-jdbc</artifactId> 
     <version>2.0.0</version> 
    </dependency> 

    <!-- https://mvnrepository.com/artifact/org.apache.hive/hive-jdbc --> 
    <dependency> 
     <groupId>org.apache.hive</groupId> 
     <artifactId>hive-jdbc</artifactId> 
     <version>1.2.1</version> 
    </dependency> 
    <dependency> 
     <groupId>jdk.tools</groupId> 
     <artifactId>jdk.tools</artifactId> 
     <version>1.8.0_131</version> 
     <scope>system</scope> 
     <systemPath>${JAVA_HOME}/lib/tools.jar</systemPath> 
    </dependency> 

    <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common --> 
    <dependency> 
     <groupId>org.apache.hadoop</groupId> 
     <artifactId>hadoop-common</artifactId> 
     <version>2.8.1</version> 
    </dependency> 

    <dependency> 
     <groupId>com.google.guava</groupId> 
     <artifactId>guava</artifactId> 
     <version>18.0</version> 
    </dependency> 
    <!-- slf4j API frontend binding with JUL backend --> 
    <dependency> 
     <groupId>org.slf4j</groupId> 
     <artifactId>slf4j-jdk14</artifactId> 
     <version>1.7.14</version> 
    </dependency> 
</dependencies> 

Added a dependency on Avro 1.8.1 and manually added the import org.apache.avro.reflect.AvroSchema; the problem is now resolved. Likewise, a similar issue with com.google.protobuf.GeneratedMessageV3 can be resolved by manually importing that class. – Balu

Answer


Adding a dependency on Avro 1.8.1 and manually adding the import org.apache.avro.reflect.AvroSchema resolved the problem. Likewise, a similar issue with com.google.protobuf.GeneratedMessageV3 can be resolved by manually importing that class.
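For reference, one way to apply this fix is to declare Avro explicitly in the POM so that org.apache.avro.reflect.AvroSchema is on the runtime classpath. A minimal sketch, assuming Maven and the Avro 1.8.1 version mentioned above:

```xml
<!-- Explicit Avro dependency so that org.apache.avro.reflect.AvroSchema
     (required by Beam's AvroCoder) is available at runtime -->
<dependency>
    <groupId>org.apache.avro</groupId>
    <artifactId>avro</artifactId>
    <version>1.8.1</version>
</dependency>
```

This works because the NoClassDefFoundError indicates the class was absent at runtime; pinning Avro directly overrides whatever (possibly conflicting) Avro version the Hive/Hadoop dependencies pulled in transitively.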
