I was previously able to load this MongoDB database, but now I get an error that I haven't been able to figure out: MongoDB Spark Connector py4j.protocol.Py4JJavaError: An error occurred while calling o50.load
Here is where I start my Spark session:
spark = SparkSession.builder \
.master("local[*]") \
.appName("collab_rec") \
.config("spark.mongodb.input.uri", "mongodb://127.0.0.1/example.collection") \
.config("spark.mongodb.output.uri", "mongodb://127.0.0.1/example.collection") \
.getOrCreate()
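As a side note on the config above: a malformed connection string only surfaces later as an opaque Java error, so it can help to sanity-check the URI in plain Python first. This is a minimal stdlib-only sketch (the `check_mongo_uri` helper is my own illustration, not part of the connector's API), using the URI from the question:

```python
# Sanity-check a MongoDB connection string before handing it to the
# SparkSession builder, so a typo fails fast in Python rather than as
# an opaque Py4J error later. Illustrative helper, not a connector API.
from urllib.parse import urlparse

def check_mongo_uri(uri):
    """Return (host, database.collection) or raise ValueError."""
    parsed = urlparse(uri)
    if parsed.scheme != "mongodb":
        raise ValueError(f"expected mongodb:// scheme, got {uri!r}")
    namespace = parsed.path.lstrip("/")
    if "." not in namespace:
        raise ValueError("URI must name a database.collection namespace")
    return parsed.hostname, namespace

host, ns = check_mongo_uri("mongodb://127.0.0.1/example.collection")
# host == "127.0.0.1", ns == "example.collection"
```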
I run this script so that I can load the mongo-spark connector package while working with Spark interactively through IPython:
#!/bin/bash
export PYSPARK_DRIVER_PYTHON=ipython
${SPARK_HOME}/bin/pyspark \
--master local[4] \
--executor-memory 1G \
--driver-memory 1G \
--conf spark.sql.warehouse.dir="file:///tmp/spark-warehouse" \
--packages com.databricks:spark-csv_2.11:1.5.0 \
--packages com.amazonaws:aws-java-sdk-pom:1.10.34 \
--packages org.apache.hadoop:hadoop-aws:2.7.3 \
--packages org.mongodb.spark:mongo-spark-connector_2.11:2.0.0
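One pitfall worth flagging in this script: as far as I can tell, spark-submit treats `--packages` as a single-valued option, so when the flag is repeated only the last value is kept and the earlier coordinates are silently dropped; the usual form is one `--packages` flag with comma-separated coordinates. The last-value-wins behaviour can be mimicked with Python's `argparse`:

```python
# Illustration: argparse's default "store" action keeps only the last
# value of a repeated flag, so earlier --packages coordinates from the
# script above would be silently dropped.
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--packages")

args = parser.parse_args([
    "--packages", "com.databricks:spark-csv_2.11:1.5.0",
    "--packages", "org.mongodb.spark:mongo-spark-connector_2.11:2.0.0",
])
print(args.packages)  # only the last value survives
```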
Spark starts fine, and it looks like the packages load correctly.
Here is where I try to load the database into a DataFrame:
df = spark.read.format("com.mongodb.spark.sql.DefaultSource").load()
However, on that line I get the following error:
Py4JJavaError: An error occurred while calling o46.load.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.analysis.TypeCoercion$.findTightestCommonTypeOfTwo()Lscala/Function2;
at com.mongodb.spark.sql.MongoInferSchema$.com$mongodb$spark$sql$MongoInferSchema$$compatibleType(MongoInferSchema.scala:132)
at com.mongodb.spark.sql.MongoInferSchema$$anonfun$3.apply(MongoInferSchema.scala:76)
at com.mongodb.spark.sql.MongoInferSchema$$anonfun$3.apply(MongoInferSchema.scala:76)
From what I can tell from the following documentation/tutorial, I am attempting to load the DataFrame correctly:
https://docs.mongodb.com/spark-connector/master/python-api/
I am using Spark 2.2.0. Note that I have been able to reproduce this error both on my Mac and on Linux through AWS.
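For context on what I think the error means: a `NoSuchMethodError` on a Spark internal like `TypeCoercion.findTightestCommonTypeOfTwo` usually points to a binary mismatch, i.e. the connector was compiled against a different Spark release than the one running. A minimal sketch of that compatibility rule, assuming the connector's major.minor release line should track Spark's (the `compatible` helper and the rule itself are my own illustration, not an official check):

```python
# Sketch of the binary-compatibility rule behind the NoSuchMethodError:
# the connector's major.minor line should match the running Spark's
# major.minor (e.g. a 2.2.x connector for Spark 2.2.0). Illustrative
# helper, not an official API.
def compatible(spark_version, connector_version):
    """True if both version strings share the same major.minor prefix."""
    spark_mm = tuple(spark_version.split(".")[:2])
    conn_mm = tuple(connector_version.split(".")[:2])
    return spark_mm == conn_mm

print(compatible("2.2.0", "2.0.0"))  # the pairing from the question
print(compatible("2.2.0", "2.2.0"))
```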