I am running spark-shell with the MongoDB Spark connector, but the program is very slow and I don't think I will ever get a response. Spark with MongoDB is slow.
My spark-shell command is:
./spark-shell --master spark://spark_host:7077 \
--conf "spark.mongodb.input.uri=mongodb://mongod_user:[email protected]_host:27017/database.collection?readPreference=primaryPreferred" \
--jars /mongodb/lib/mongo-spark-connector_2.10-2.0.0.jar,/mongodb/lib/bson-3.2.2.jar,/mongodb/lib/mongo-java-driver-3.2.2.jar
And my application code is:
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import com.mongodb.spark._
import org.bson.Document
import com.mongodb.spark.config.ReadConfig
import org.apache.spark.sql.SparkSession
import com.mongodb.spark.rdd.MongoRDD
val sparkSession = SparkSession.builder().getOrCreate()
val df = MongoSpark.load(sparkSession)
val dataset = df.filter("thisRequestTime > 1499250131596")
dataset.first // waits a very long time
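One way to narrow down where the time goes is to check whether the filter is actually pushed down to MongoDB or whether Spark is scanning the whole collection. A minimal diagnostic sketch, assuming the same `dataset` as above (the plan output shown in the comments is only what one would typically look for, not captured from this job):

```scala
// Inspect the physical plan. If the connector pushes the predicate down,
// the MongoDB relation node should list the condition under PushedFilters,
// e.g. PushedFilters: [IsNotNull(thisRequestTime), GreaterThan(thisRequestTime,...)].
dataset.explain()

// Triggering an action that only needs the match count avoids
// transferring a full document just to test responsiveness:
println(dataset.count())
```

If `PushedFilters` is empty, every document is being pulled from MongoDB into Spark before filtering, which would explain the large gap versus the indexed query in the mongo shell.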
Am I missing something? Please help me. PS: My Spark runs in standalone mode. The application dependencies are:
<properties>
<encoding>UTF-8</encoding>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
<scala.compat.version>2.11</scala.compat.version>
<spark.version>2.1.1</spark.version>
</properties>
<dependencies>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_${scala.compat.version}</artifactId>
<version>${spark.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_${scala.compat.version}</artifactId>
<version>${spark.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.mongodb.spark</groupId>
<artifactId>mongo-spark-connector_${scala.compat.version}</artifactId>
<version>2.0.0</version>
</dependency>
</dependencies>
How large is the dataset you expect? And how long does the query take when run directly on MongoDB? –
Thank you for your reply @Rick Moritz. The total number of documents in MongoDB is 194,920,414, and 749,216 of them match the filter condition. I get a response from the Spark application after half an hour, but in the mongodb shell the same condition returns in milliseconds. – Milk
PS: I have an index on the filtered field of the MongoDB documents. – Milk