2017-03-04

PredictionIO-0.10.0-incubating on Ubuntu 16.04 fails with 'Exception in thread "main" java.sql.SQLException: No suitable driver found for jdbc:postgresql://localhost/pio' when I run bin/pio import --appid 1 --input engine/data/stopwords.json.

Where does pio look for the driver?

There is no problem when I run bin/pio status:

[INFO] [Console$] Inspecting PredictionIO... 
[INFO] [Console$] PredictionIO 0.10.0-incubating is installed at /home/homedir/mnt/predictionio/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating 
[INFO] [Console$] Inspecting Apache Spark... 
[INFO] [Console$] Apache Spark is installed at /home/homedir/mnt/predictionio/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/vendors/spark-1.5.1-bin-hadoop2.6 
[INFO] [Console$] Apache Spark 1.5.1 detected (meets minimum requirement of 1.3.0) 
[INFO] [Console$] Inspecting storage backend connections... 
[INFO] [Storage$] Verifying Meta Data Backend (Source: PGSQL)... 
[INFO] [Storage$] Verifying Model Data Backend (Source: PGSQL)... 
[INFO] [Storage$] Verifying Event Data Backend (Source: PGSQL)... 
[INFO] [Storage$] Test writing to Event Store (App Id 0)... 
[INFO] [Console$] (sleeping 5 seconds for all messages to show up...) 
[INFO] [Console$] Your system is all ready to go. 

The PostgreSQL JDBC JAR is installed at /usr/lib/jvm/postgresql-42.0.0.jar.

bin/pio import --appid 1 --input engine/data/stopwords.json 
    [INFO] [Remoting] Starting remoting 
    [INFO] [Remoting] Remoting started; listening on addresses :[akka.tcp://[email protected]:41940] 
    [WARN] [MetricsSystem] Using default name DAGScheduler for source because spark.app.id is not set. 
    Exception in thread "main" java.sql.SQLException: No suitable driver found for jdbc:postgresql://localhost/pio 
      at java.sql.DriverManager.getConnection(DriverManager.java:689) 
      at java.sql.DriverManager.getConnection(DriverManager.java:208) 
      at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$getConnector$1.apply(JDBCRDD.scala:188) 
      at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$getConnector$1.apply(JDBCRDD.scala:181) 
      at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.createConnection(JdbcUtils.scala:39) 
      at org.apache.spark.sql.DataFrameWriter.jdbc(DataFrameWriter.scala:253) 
      at org.apache.predictionio.data.storage.jdbc.JDBCPEvents.write(JDBCPEvents.scala:162) 
      at org.apache.predictionio.tools.imprt.FileToEvents$$anonfun$main$1.apply(FileToEvents.scala:101) 
      at org.apache.predictionio.tools.imprt.FileToEvents$$anonfun$main$1.apply(FileToEvents.scala:68) 
      at scala.Option.map(Option.scala:145) 
      at org.apache.predictionio.tools.imprt.FileToEvents$.main(FileToEvents.scala:68) 
      at org.apache.predictionio.tools.imprt.FileToEvents.main(FileToEvents.scala) 
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
      at java.lang.reflect.Method.invoke(Method.java:498) 
      at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672) 
      at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180) 
      at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205) 
      at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120) 
      at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 

Answers


By default, it looks for the driver at $PIO_HOME/lib/postgresql-9.4-1204.jdbc41.jar.

That path is defined in the file $PIO_HOME/conf/pio-env.sh, for example: POSTGRES_JDBC_DRIVER=$PIO_HOME/lib/postgresql-9.4-1204.jdbc41.jar
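One fix, then, is to point pio-env.sh at the jar that is actually installed. A sketch, assuming the jar is at /usr/lib/jvm/postgresql-42.0.0.jar as described in the question:

```shell
# In $PIO_HOME/conf/pio-env.sh, change the driver path from the
# bundled default to the jar that is actually on disk:
POSTGRES_JDBC_DRIVER=/usr/lib/jvm/postgresql-42.0.0.jar
```

Alternatively, copying the jar into $PIO_HOME/lib under the filename pio-env.sh already expects should have the same effect.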


Tell PIO where to find the jar:

bin/pio import --appid 1 --input engine/data/emails.json -- --driver-class-path /usr/lib/jvm/postgresql-42.0.0.jar 