2015-08-28

I'm having a hard time getting the camel-hdfs2 component to work in a Karaf 4.0 OSGi container. The route itself is very simple: it reads files from HDFS and writes each file's name to a new file under /tmp.

Tags: karaf, osgi, camel-hdfs2

I have it working outside of the Karaf OSGi container by just running the main method (included below), but when I try to start it from Karaf, I get:

java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.LocalFileSystem not found 
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1882) 
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2298) 
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2311) 
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:90) 
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2350) 
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2332) 
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:369) 
at cas.example.camel_hdfs.LocalRouteBuilder.start(LocalRouteBuilder.java:83) 
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)[:1.8.0_51] 
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)[:1.8.0_51] 
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)[:1.8.0_51] 
at java.lang.reflect.Method.invoke(Method.java:497)[:1.8.0_51] 
at org.apache.felix.scr.impl.helper.BaseMethod.invokeMethod(BaseMethod.java:231)[23:org.apache.felix.scr:1.8.2] 
at org.apache.felix.scr.impl.helper.BaseMethod.access$500(BaseMethod.java:39)[23:org.apache.felix.scr:1.8.2] 
at org.apache.felix.scr.impl.helper.BaseMethod$Resolved.invoke(BaseMethod.java:624)[23:org.apache.felix.scr:1.8.2] 
at org.apache.felix.scr.impl.helper.BaseMethod.invoke(BaseMethod.java:508)[23:org.apache.felix.scr:1.8.2] 
at org.apache.felix.scr.impl.helper.ActivateMethod.invoke(ActivateMethod.java:149)[23:org.apache.felix.scr:1.8.2] 
at org.apache.felix.scr.impl.manager.SingleComponentManager.createImplementationObject(SingleComponentManager.java:315)[23:org.apache.felix.scr:1.8.2] 
at org.apache.felix.scr.impl.manager.SingleComponentManager.createComponent(SingleComponentManager.java:127)[23:org.apache.felix.scr:1.8.2] 
at org.apache.felix.scr.impl.manager.SingleComponentManager.getService(SingleComponentManager.java:871)[23:org.apache.felix.scr:1.8.2] 
at org.apache.felix.scr.impl.manager.SingleComponentManager.getServiceInternal(SingleComponentManager.java:838)[23:org.apache.felix.scr:1.8.2] 
at org.apache.felix.scr.impl.manager.AbstractComponentManager.activateInternal(AbstractComponentManager.java:850)[23:org.apache.felix.scr:1.8.2] 
at org.apache.felix.scr.impl.manager.AbstractComponentManager.enable(AbstractComponentManager.java:419)[23:org.apache.felix.scr:1.8.2] 
at org.apache.felix.scr.impl.config.ConfigurableComponentHolder.enableComponents(ConfigurableComponentHolder.java:376)[23:org.apache.felix.scr:1.8.2] 
at org.apache.felix.scr.impl.BundleComponentActivator.initialize(BundleComponentActivator.java:172)[23:org.apache.felix.scr:1.8.2] 
at org.apache.felix.scr.impl.BundleComponentActivator.<init>(BundleComponentActivator.java:120)[23:org.apache.felix.scr:1.8.2] 
at org.apache.felix.scr.impl.Activator.loadComponents(Activator.java:258)[23:org.apache.felix.scr:1.8.2] 
at org.apache.felix.scr.impl.Activator.access$000(Activator.java:45)[23:org.apache.felix.scr:1.8.2] 
at org.apache.felix.scr.impl.Activator$ScrExtension.start(Activator.java:185)[23:org.apache.felix.scr:1.8.2] 
at org.apache.felix.utils.extender.AbstractExtender.createExtension(AbstractExtender.java:259)[23:org.apache.felix.scr:1.8.2] 
at org.apache.felix.utils.extender.AbstractExtender.modifiedBundle(AbstractExtender.java:232)[23:org.apache.felix.scr:1.8.2] 
at org.osgi.util.tracker.BundleTracker$Tracked.customizerModified(BundleTracker.java:479)[23:org.apache.felix.scr:1.8.2] 
at org.osgi.util.tracker.BundleTracker$Tracked.customizerModified(BundleTracker.java:414)[23:org.apache.felix.scr:1.8.2] 
at org.osgi.util.tracker.AbstractTracked.track(AbstractTracked.java:232)[23:org.apache.felix.scr:1.8.2] 
at org.osgi.util.tracker.BundleTracker$Tracked.bundleChanged(BundleTracker.java:443)[23:org.apache.felix.scr:1.8.2] 
at org.apache.felix.framework.util.EventDispatcher.invokeBundleListenerCallback(EventDispatcher.java:913)[org.apache.felix.framework-5.0.1.jar:] 
at org.apache.felix.framework.util.EventDispatcher.fireEventImmediately(EventDispatcher.java:834)[org.apache.felix.framework-5.0.1.jar:] 
at org.apache.felix.framework.util.EventDispatcher.fireBundleEvent(EventDispatcher.java:516)[org.apache.felix.framework-5.0.1.jar:] 
at org.apache.felix.framework.Felix.fireBundleEvent(Felix.java:4544)[org.apache.felix.framework-5.0.1.jar:] 
at org.apache.felix.framework.Felix.startBundle(Felix.java:2166)[org.apache.felix.framework-5.0.1.jar:] 
at org.apache.felix.framework.BundleImpl.start(BundleImpl.java:977)[org.apache.felix.framework-5.0.1.jar:] 
at org.apache.felix.fileinstall.internal.DirectoryWatcher.startBundle(DirectoryWatcher.java:1245)[4:org.apache.felix.fileinstall:3.5.0] 
at org.apache.felix.fileinstall.internal.DirectoryWatcher.startBundles(DirectoryWatcher.java:1217)[4:org.apache.felix.fileinstall:3.5.0] 
at org.apache.felix.fileinstall.internal.DirectoryWatcher.startAllBundles(DirectoryWatcher.java:1207)[4:org.apache.felix.fileinstall:3.5.0] 
at org.apache.felix.fileinstall.internal.DirectoryWatcher.doProcess(DirectoryWatcher.java:504)[4:org.apache.felix.fileinstall:3.5.0] 
at org.apache.felix.fileinstall.internal.DirectoryWatcher.process(DirectoryWatcher.java:358)[4:org.apache.felix.fileinstall:3.5.0] 
at org.apache.felix.fileinstall.internal.DirectoryWatcher.run(DirectoryWatcher.java:310)[4:org.apache.felix.fileinstall:3.5.0] 
Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.LocalFileSystem not found 
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1788) 
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1880) 
... 46 more 
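The failing frames are Hadoop's Configuration.getClass/getClassByName, which resolve the configured filesystem implementation by name against a classloader, by default the thread context classloader. A simplified sketch of that lookup (not the actual Hadoop source, just an illustration of the mechanism):

```java
// Simplified sketch of how Hadoop's Configuration resolves a class by
// name (the call that throws above): it consults the thread context
// classloader first, falling back to the loader of the class itself.
public class ClassByNameSketch {

    public static Class<?> getClassByName(String name) throws ClassNotFoundException {
        ClassLoader cl = Thread.currentThread().getContextClassLoader();
        if (cl == null) {
            // Fall back to the loader that loaded this class.
            cl = ClassByNameSketch.class.getClassLoader();
        }
        return Class.forName(name, true, cl);
    }
}
```

Under OSGi, the thread context classloader at activation time is often not the bundle's classloader, so the by-name lookup can fail even though the class is wired to the bundle.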

I know the class is available at runtime (bundle 59 is my bundle). In it, I define the Camel route in a RouteBuilder (class shown below) that uses the camel-hdfs2 component:

karaf@root()> list 59 
START LEVEL 100 , List Threshold: 50 
ID | State | Lvl | Version  | Name 
--------------------------------------------------- 
59 | Active | 80 | 0.0.1.SNAPSHOT | cas-camel-hdfs 
karaf@root()> bundle:classes 59 | grep LocalFileSystem 
org/apache/hadoop/fs/LocalFileSystem.class 
org/apache/hadoop/fs/LocalFileSystemConfigKeys.class 
org/apache/hadoop/fs/RawLocalFileSystem$1.class 
org/apache/hadoop/fs/RawLocalFileSystem$DeprecatedRawLocalFileStatus.class 
org/apache/hadoop/fs/RawLocalFileSystem$LocalFSFileInputStream.class 
org/apache/hadoop/fs/RawLocalFileSystem$LocalFSFileOutputStream.class 
org/apache/hadoop/fs/RawLocalFileSystem.class 
karaf@root()> 

Here is my RouteBuilder/activator:

package cas.example.camel_hdfs; 

import java.net.URI; 

import org.apache.camel.CamelContext; 
import org.apache.camel.builder.RouteBuilder; 
import org.apache.camel.main.Main; 
import org.apache.hadoop.conf.Configuration; 
import org.apache.hadoop.fs.FileSystem; 
import org.apache.hadoop.fs.LocalFileSystem; 
import org.apache.hadoop.hdfs.DistributedFileSystem; 
import org.osgi.framework.BundleContext; 

import aQute.bnd.annotation.component.Activate; 
import aQute.bnd.annotation.component.Component; 
import aQute.bnd.annotation.component.Deactivate; 

@Component 
public class LocalRouteBuilder extends RouteBuilder { 

    private final String hdfsHost; 
    private final String path; 
    private static final String MARKED_SUFFIX = "ingested"; 

    /** 
    * If running in OSGI... 
    */ 
    private CamelContext cContext = null; 

    public LocalRouteBuilder() { 
     this("10.10.1.20", "/user/cloud-user/cas-docs", "cloud-user"); 
    } 

    /** 
    * If you use this constructor, make sure the HADOOP_USER_NAME is set via a 
    * jvm property. 
    * 
    * @param hdfsHost 
    * @param path 
    */ 
    public LocalRouteBuilder(final String hdfsHost, final String path) { 
     this(hdfsHost, path, null); 
    } 

    /** 
    * 
    * @param hdfsHost 
    * @param path 
    * @param userName 
    */ 
    public LocalRouteBuilder(final String hdfsHost, final String path, final String userName) { 
     this.cContext = this.getContext(); 
     this.hdfsHost = hdfsHost; 
     this.path = path; 
     if (userName != null) { 
      System.setProperty("HADOOP_USER_NAME", userName); 
     } 
    } 

    /** 
    * {@inheritDoc} 
    */ 
    @Override 
    public void configure() throws Exception { 

     from("hdfs2://" + hdfsHost + path + "?delay=5000&chunkSize=4096&connectOnStartup=true&readSuffix=" + MARKED_SUFFIX) 

     .setBody(simple(path + "/${header[CamelFileName]}." + MARKED_SUFFIX)) 

     .to("log:cas.example.camel_hdfs.BasicRouteBuilder") 

     .to("file:/tmp/RECEIVED") 

     .stop().end(); 

    } 

    @Activate 
    public void start(BundleContext context) throws Exception { 
     Configuration conf = new Configuration(); 
     conf.setClass("fs.file.impl", LocalFileSystem.class, FileSystem.class); 
     conf.setClass("fs.hdfs.impl", DistributedFileSystem.class, FileSystem.class); 
     FileSystem.get(URI.create("file:///"), conf); 
     FileSystem.get(URI.create("hdfs://10.10.1.20:9000/"), conf); 

     if (cContext == null) { 
      cContext = this.getContext(); 
     } 
     // cContext = new OsgiDefaultCamelContext(context); 
     cContext.addRoutes(this); 
     cContext.start(); 
     cContext.startAllRoutes(); 
    } 

    @Deactivate 
    public void stop(BundleContext context) throws Exception { 
     System.out.println("Stopping hdfs camel bundle"); 
     if (cContext != null) { 
      cContext.stop(); 
      cContext = null; 
     } 
    } 

    public static void main(String[] args) { 
     try { 
      Main m = new Main(); 
      m.addRouteBuilder(new LocalRouteBuilder("10.10.1.20", "/user/cloud-user/cas-docs", "cloud-user")); 
      m.enableHangupSupport(); 
      m.enableTrace(); 
      m.run(); 
     } catch (Exception e) { 
      e.printStackTrace(); 
      System.exit(-1); 
     } 
    } 

} 
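A common workaround for this class of failure (an assumption on my part, not something from the original post) is to switch the thread context classloader to the bundle's own classloader around the FileSystem.get(...) calls, since Hadoop consults the TCCL when resolving fs.*.impl classes by name. A small helper sketch (the names TcclRunner and callWith are invented for this example):

```java
import java.util.concurrent.Callable;

// Runs an action with a chosen thread context classloader and restores
// the previous one afterwards. In start() above, this could wrap the
// FileSystem.get(...) calls using getClass().getClassLoader().
public final class TcclRunner {

    public static <T> T callWith(ClassLoader loader, Callable<T> action) throws Exception {
        Thread current = Thread.currentThread();
        ClassLoader previous = current.getContextClassLoader();
        current.setContextClassLoader(loader);
        try {
            return action.call();
        } finally {
            // Always restore the original context classloader.
            current.setContextClassLoader(previous);
        }
    }
}
```

For example: TcclRunner.callWith(getClass().getClassLoader(), () -> FileSystem.get(URI.create("file:///"), conf));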

Just in case it helps, here is the bundle list:

karaf@root()> list 
START LEVEL 100 , List Threshold: 50 
ID | State | Lvl | Version   | Name 
---------------------------------------------------------------------------------------------- 
58 | Active | 80 | 0.0.1.SNAPSHOT  | karaf-feature-export 
59 | Active | 80 | 0.0.1.SNAPSHOT  | cas-camel-hdfs 
60 | Active | 80 | 2.4.0.201411031534 | bndlib 
61 | Active | 80 | 2.15.2    | camel-blueprint 
62 | Active | 80 | 2.15.2    | camel-catalog 
63 | Active | 80 | 2.15.2    | camel-commands-core 
64 | Active | 80 | 2.15.2    | camel-core 
65 | Active | 80 | 2.15.2    | camel-spring 
66 | Active | 80 | 2.15.2    | camel-karaf-commands 
67 | Active | 80 | 1.1.1    | geronimo-jta_1.1_spec 
72 | Active | 80 | 2.2.6.1   | Apache ServiceMix :: Bundles :: jaxb-impl 
84 | Active | 80 | 3.1.4    | Stax2 API 
85 | Active | 80 | 4.4.1    | Woodstox XML-processor 
86 | Active | 80 | 2.15.2    | camel-core-osgi 
87 | Active | 80 | 18.0.0    | Guava: Google Core Libraries for Java 
88 | Active | 80 | 2.6.1    | Protocol Buffer Java API 
89 | Active | 80 | 1.9.12    | Jackson JSON processor 
90 | Active | 80 | 1.9.12    | Data mapper for Jackson JSON processor 
91 | Active | 80 | 2.15.2    | camel-hdfs2 
92 | Active | 80 | 1.2    | Commons CLI 
93 | Active | 80 | 1.10.0    | Apache Commons Codec 
94 | Active | 80 | 3.2.1    | Commons Collections 
95 | Active | 80 | 1.5.0    | Commons Compress 
96 | Active | 80 | 1.9.0    | Commons Configuration 
97 | Active | 80 | 2.4.0    | Commons IO 
98 | Active | 80 | 2.6    | Commons Lang 
99 | Active | 80 | 3.3.0    | Apache Commons Math 
100 | Active | 80 | 3.3.0    | Commons Net 
101 | Active | 80 | 3.4.6    | ZooKeeper Bundle 
102 | Active | 80 | 1.7.7.1   | Apache ServiceMix :: Bundles :: avro 
103 | Active | 80 | 3.1.0.7   | Apache ServiceMix :: Bundles :: commons-httpclient 
104 | Active | 80 | 3.0.0.1   | Apache ServiceMix :: Bundles :: guice 
105 | Active | 80 | 2.3.0.2   | Apache ServiceMix :: Bundles :: hadoop-client 
106 | Active | 80 | 0.1.51.1   | Apache ServiceMix :: Bundles :: jsch 
107 | Active | 80 | 2.6.0.1   | Apache ServiceMix :: Bundles :: paranamer 
108 | Active | 80 | 0.52.0.1   | Apache ServiceMix :: Bundles :: xmlenc 
109 | Active | 80 | 1.2.0.5   | Apache ServiceMix :: Bundles :: xmlresolver 
110 | Active | 80 | 3.9.6.Final  | Netty 
111 | Resolved | 80 | 1.1.0.1   | Snappy for Java 
karaf@root()> 

Thanks for the help!

-Ben

EDIT:

So, I added the bundle headers for my custom bundle (I did a karaf clean, so the bundle ID changed from 59 to 109).

karaf@root()> bundle:headers 109 

cas-camel-hdfs (109) 
-------------------- 
Bnd-LastModified = 1440904390702 
Build-Jdk = 1.8.0_51 
Built-By = bdgould 
Created-By = Apache Maven Bundle Plugin 
Manifest-Version = 1.0 
Service-Component = OSGI-INF/cas.example.camel_hdfs.Hdfs2RouteBuilder.xml,OSGI-INF/cas.example.camel_hdfs.SimpleRouteBuilder.xml 
Tool = Bnd-2.4.1.201501161923 

Bundle-ManifestVersion = 2 
Bundle-Name = cas-camel-hdfs 
Bundle-SymbolicName = com.inovexcorp.cas_cas-camel-hdfs 
Bundle-Version = 0.0.1.SNAPSHOT 

Require-Capability = 
    osgi.ee;filter:=(&(osgi.ee=JavaSE)(version=1.8)) 

DynamicImport-Package = 
    * 
Export-Package = 
    cas.example.camel_hdfs;uses:="org.apache.camel.builder,org.osgi.framework";version=0.0.1.SNAPSHOT 
Import-Package = 
    org.apache.camel;version="[2.15,3)", 
    org.apache.camel.builder;version="[2.15,3)", 
    org.apache.camel.main;version="[2.15,3)", 
    org.apache.camel.model;version="[2.15,3)", 
    org.apache.hadoop.conf, 
    org.apache.hadoop.fs, 
    org.apache.hadoop.hdfs, 
    org.osgi.framework;version="[1.6,2)", 
    org.apache.camel.component.hdfs2;version="[2.15,3)" 

I still don't know why it can't find the LocalFileSystem class, since it is definitely exported by:

102 | Active | 80 | 2.3.0.2   | Apache ServiceMix :: Bundles :: hadoop-client 

which is the Hadoop bundle installed as part of the camel-hdfs2 feature.

EDIT 2: Hmm, I actually have no idea why bundle:classes shows me all those classes. I just opened up my JAR, and all I see is:

" zip.vim version v27 
" Browsing zipfile /opt/apache-karaf-4.0.1/deploy/cas-camel-hdfs-0.0.1-SNAPSHOT.jar 
" Select a file with cursor and press ENTER 

META-INF/MANIFEST.MF 
META-INF/ 
META-INF/maven/ 
META-INF/maven/com.inovexcorp.cas/ 
META-INF/maven/com.inovexcorp.cas/cas-camel-hdfs/ 
META-INF/maven/com.inovexcorp.cas/cas-camel-hdfs/pom.properties 
META-INF/maven/com.inovexcorp.cas/cas-camel-hdfs/pom.xml 
META-INF/services/ 
META-INF/services/org.apache.hadoop.fs.FileSystem 
OSGI-INF/ 
OSGI-INF/cas.example.camel_hdfs.Hdfs2RouteBuilder.xml 
OSGI-INF/cas.example.camel_hdfs.SimpleRouteBuilder.xml 
cas/ 
cas/example/ 
cas/example/camel_hdfs/ 
cas/example/camel_hdfs/Hdfs2RouteBuilder.class 
cas/example/camel_hdfs/SimpleRouteBuilder.class 
core-default.xml 
hdfs-default.xml 
log4j.xml 

The classes Karaf lists don't seem to match what is actually in the JAR (but maybe it shows all the classes my bundle can see?). Here is my POM, just in case it helps:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"> 
    <modelVersion>4.0.0</modelVersion> 
    <groupId>com.inovexcorp.cas</groupId> 
    <artifactId>cas-camel-hdfs</artifactId> 
    <version>0.0.1-SNAPSHOT</version> 
    <name>Example Camel-HDFS Integration</name> 
    <packaging>bundle</packaging> 

    <properties> 
     <project.build.sourceEncoding>UTF8</project.build.sourceEncoding> 
     <camel.version>2.15.2</camel.version> 
    </properties> 

    <dependencies> 
     <dependency> 
      <groupId>org.apache.camel</groupId> 
      <artifactId>camel-core</artifactId> 
      <version>${camel.version}</version> 
      <scope>provided</scope> 
     </dependency> 
     <dependency> 
      <groupId>org.apache.camel</groupId> 
      <artifactId>camel-hdfs2</artifactId> 
      <version>${camel.version}</version> 
      <scope>provided</scope> 
     </dependency> 
     <!-- <dependency> 
      <groupId>org.apache.camel</groupId> 
      <artifactId>camel-core-osgi</artifactId> 
      <version>${camel.version}</version> 
     </dependency> --> 
     <dependency> 
      <groupId>log4j</groupId> 
      <artifactId>log4j</artifactId> 
      <version>1.2.17</version> 
      <scope>provided</scope> 
     </dependency> 
     <dependency> 
      <groupId>org.ops4j.pax.logging</groupId> 
      <artifactId>pax-logging-api</artifactId> 
      <version>1.7.0</version> 
      <scope>provided</scope> 
     </dependency> 
     <dependency> 
      <groupId>biz.aQute.bnd</groupId> 
      <artifactId>bndlib</artifactId> 
      <version>2.3.0</version> 
      <scope>provided</scope> 
     </dependency> 
    </dependencies> 


    <build> 
     <plugins> 
      <plugin> 
       <groupId>org.apache.maven.plugins</groupId> 
       <artifactId>maven-compiler-plugin</artifactId> 
       <version>3.3</version> 
       <configuration> 
        <!-- http://maven.apache.org/plugins/maven-compiler-plugin/ --> 
        <source>1.8</source> 
        <target>1.8</target> 
       </configuration> 
      </plugin> 
      <plugin> 
       <groupId>org.apache.felix</groupId> 
       <artifactId>maven-bundle-plugin</artifactId> 
       <extensions>true</extensions> 
       <configuration> 
        <instructions> 
         <Bundle-SymbolicName>${project.groupId}_${project.artifactId}</Bundle-SymbolicName> 
         <Bundle-Name>${project.artifactId}</Bundle-Name> 
         <Bundle-Version>${project.version}</Bundle-Version> 
         <Import-Package>org.apache.camel.component.hdfs2,*;resolution:=required</Import-Package> 
         <Service-Component>*</Service-Component> 
        </instructions> 
       </configuration> 
      </plugin> 
     </plugins> 
    </build> 

</project> 

Answer

Looks like a classic Import-Package/Export-Package problem. Does the bundle that actually contains the class in question export it?

Export-Package: org.apache.hadoop.fs 

After your update: you apparently import the correct package, but according to your listing

59 | Active | 80 | 0.0.1.SNAPSHOT | cas-camel-hdfs 
karaf@root()> bundle:classes 59 | grep LocalFileSystem 
org/apache/hadoop/fs/LocalFileSystem.class 
org/apache/hadoop/fs/LocalFileSystemConfigKeys.class 
org/apache/hadoop/fs/RawLocalFileSystem$1.class 
org/apache/hadoop/fs/RawLocalFileSystem$DeprecatedRawLocalFileStatus.class 
org/apache/hadoop/fs/RawLocalFileSystem$LocalFSFileInputStream.class 
org/apache/hadoop/fs/RawLocalFileSystem$LocalFSFileOutputStream.class 
org/apache/hadoop/fs/RawLocalFileSystem.class 

your own bundle also contains these classes, so make sure you don't package them along. Most likely your dependency on org.apache.servicemix.bundles.hadoop-client is marked as a compile dependency in your pom.


So, 59 is actually my bundle, which imports the packages. I'll update my question a bit to make that clearer. My bundle is using the camel-hdfs2 feature and importing the packages I thought were necessary. – bdgould


karaf@root()> capabilities 102 
org.apache.servicemix.bundles.hadoop-client [102] provides: 
... 
osgi.wiring.package; org.apache.hadoop.fs 2.3.0 required by: 
com.inovexcorp.cas_cas-camel-hdfs [109] 
org.apache.camel.camel-hdfs2 [88] 
... – bdgould


I'm actually not sure why karaf bundle:classes is showing those classes (unless it really does look at all the classes my bundle references). I've updated the question with the POM and the contents of my JAR (viewed via vim). Thanks – Ben – bdgould