2017-07-17 287 views
0

I am trying to read data from MongoDB through an Apache Spark master, but I cannot connect to MongoDB via Spark.

I am using 3 machines:

  • M1 - with a MongoDB instance on it
  • M2 - with a Spark master and the Mongo connector running on it
  • M3 - running a Python application that connects to the Spark master on M2

The application (M3) obtains a connection to the Spark master like this:

_sparkSession = SparkSession.builder.master(masterPath).appName(appName)\ 
.config("spark.mongodb.input.uri", "mongodb://10.0.3.150/db1.data.coll")\ 
.config("spark.mongodb.output.uri", "mongodb://10.0.3.150/db1.data.coll").getOrCreate() 

The application (M3) then tries to read data from the database:

sqlContext = SQLContext(_sparkSession.sparkContext) 
df = sqlContext.read.format("com.mongodb.spark.sql.DefaultSource")\ 
    .option("uri","mongodb://user:[email protected]/db1.data?readPreference=primaryPreferred")\ 
    .load() 

but it fails with this exception:

py4j.protocol.Py4JJavaError: An error occurred while calling o56.load. 
: java.lang.ClassNotFoundException: Failed to find data source: com.mongodb.spark.sql.DefaultSource. Please find packages at http://spark.apache.org/third-party-projects.html 
     at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:594) 
     at org.apache.spark.sql.execution.datasources.DataSource.providingClass$lzycompute(DataSource.scala:86) 
     at org.apache.spark.sql.execution.datasources.DataSource.providingClass(DataSource.scala:86) 
     at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:325) 
     at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:152) 
     at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:125) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:498) 
     at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244) 
     at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357) 
     at py4j.Gateway.invoke(Gateway.java:280) 
     at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132) 
     at py4j.commands.CallCommand.execute(CallCommand.java:79) 
     at py4j.GatewayConnection.run(GatewayConnection.java:214) 
     at java.lang.Thread.run(Thread.java:748) 
Caused by: java.lang.ClassNotFoundException: com.mongodb.spark.sql.DefaultSource.DefaultSource 
     at java.net.URLClassLoader.findClass(URLClassLoader.java:381) 
     at java.lang.ClassLoader.loadClass(ClassLoader.java:424) 
     at java.lang.ClassLoader.loadClass(ClassLoader.java:357) 
     at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25$$anonfun$apply$13.apply(DataSource.scala:579) 
     at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25$$anonfun$apply$13.apply(DataSource.scala:579) 
     at scala.util.Try$.apply(Try.scala:192) 
     at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25.apply(DataSource.scala:579) 
     at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25.apply(DataSource.scala:579) 
     at scala.util.Try.orElse(Try.scala:84) 
     at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:579) 
     ... 16 more 

Answer

3

Spark cannot find the com.mongodb.spark.sql.DefaultSource package, hence the error message.

Everything else looks good; you just need to include the Mongo Spark package:

> $SPARK_HOME/bin/pyspark --packages org.mongodb.spark:mongo-spark-connector_2.11:2.2.0 

Or make sure the jar file is on the correct path.
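If the application creates its own SparkSession (as in the question) rather than going through the pyspark shell, the package can also be requested programmatically via Spark's spark.jars.packages setting, which must be configured before the session is created. Here is a minimal sketch, assuming the Scala 2.11 build of the connector for Spark 2.x; the master URL and app name below are placeholders, and the Mongo URIs are taken from the question:

from pyspark.sql import SparkSession

masterPath = "spark://<spark-master-host>:7077"  # placeholder: the M2 master URL
appName = "mongo-read-test"                      # placeholder app name

# spark.jars.packages tells Spark to fetch the connector from Maven at startup,
# so it is available on the driver and the executors.
spark = SparkSession.builder.master(masterPath).appName(appName)\
    .config("spark.jars.packages", "org.mongodb.spark:mongo-spark-connector_2.11:2.2.0")\
    .config("spark.mongodb.input.uri", "mongodb://10.0.3.150/db1.data.coll")\
    .config("spark.mongodb.output.uri", "mongodb://10.0.3.150/db1.data.coll")\
    .getOrCreate()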

Make sure to check which version of the Mongo-Spark package your Spark version requires: https://spark-packages.org/package/mongodb/mongo-spark

+0

Thank you for your answer. I should clarify that I run the application as a remote Python application, not through the PySpark shell. So, as a noob Python developer, I ask again: how do I run my application with the connector package? Or do I need to run the Spark master with the package? –

+0

Please update the question with more information on how you submit the Spark job, and I will update my answer accordingly. – Ross

+1

I changed the way I use the Spark master. I started the Spark master and its slaves (workers), and then ran spark-submit with the mongo-spark-connector package and my Python script. I guess this is the recommended approach. Thanks all –
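For reference, the spark-submit invocation described in the comment above would look roughly like this; the master URL and script name are placeholders:

> $SPARK_HOME/bin/spark-submit --master spark://<spark-master-host>:7077 \
    --packages org.mongodb.spark:mongo-spark-connector_2.11:2.2.0 \
    my_app.py

Passing --packages to spark-submit ships the connector jar to the driver and the executors, so the application code itself does not need to change.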