I want to run a simple Spark row count against a Cassandra table that has only four rows, just to verify that everything works end to end: Python -> Py4J -> Spark -> Cassandra.
I can get this working quickly from Java:
JavaSparkContext sc = new JavaSparkContext(conf);
SparkContextJavaFunctions sparkContextJavaFunctions = CassandraJavaUtil.javaFunctions(sc);
CassandraJavaRDD<CassandraRow> table = sparkContextJavaFunctions.cassandraTable("demo", "playlists");
long count = table.count();
Now I want to get the same thing working in Python. The Spark distribution ships with an unbundled set of PySpark sources for using Spark from Python. It uses a library called py4j to launch a Java server and marshal Java commands through a TCP gateway, and I am using that gateway directly.
I specified the following extra jars to the Java SparkSubmit process via the --driver-class-path option (an example invocation is sketched after the list):
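For completeness, the conf passed to JavaSparkContext in the Python snippet further down is a JVM-side SparkConf built through the same gateway. A minimal sketch of how it can be constructed is below; the app name, master URL, and Cassandra host are placeholder values, not my actual settings:

# Assumes jvm_gateway has already been created with launch_gateway(), as in the snippet further down.
# App name, master URL, and Cassandra host are placeholder values.
conf = jvm_gateway.jvm.org.apache.spark.SparkConf() \
    .setAppName("cassandra-count-test") \
    .setMaster("local[*]") \
    .set("spark.cassandra.connection.host", "127.0.0.1")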
spark-cassandra-connector-java_2.11-1.2.0-rc1.jar
spark-cassandra-connector_2.11-1.2.0-rc1.jar
cassandra-thrift-2.1.3.jar
cassandra-clientutil-2.1.3.jar
cassandra-driver-core-2.1.5.jar
libthrift-0.9.2.jar
joda-convert-1.2.jar
joda-time-2.3.jar
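For illustration, passing these on a spark-submit command line would look roughly like this; the jar directory and the script name are placeholders, and the colon-separated list is abbreviated:

spark-submit \
  --driver-class-path /path/to/jars/spark-cassandra-connector-java_2.11-1.2.0-rc1.jar:/path/to/jars/spark-cassandra-connector_2.11-1.2.0-rc1.jar:... \
  row_count_test.py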
Here is the core Python code for the row count test:
from pyspark.java_gateway import launch_gateway
# Launch the py4j gateway (this starts the JVM-side server).
jvm_gateway = launch_gateway()
# conf is a JVM-side SparkConf built through the gateway, as sketched above.
sc = jvm_gateway.jvm.org.apache.spark.api.java.JavaSparkContext(conf)
spark_cass_functions = jvm_gateway.jvm.com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions(sc)
table = spark_cass_functions.cassandraTable("demo", "playlists")
On the last line, I get the following error:
py4j.protocol.Py4JJavaError: An error occurred while calling o5.cassandraTable.
: com.datastax.spark.connector.util.ConfigCheck$ConnectorConfigurationException: Invalid Config Variables
Only known spark.cassandra.* variables are allowed when using the Spark Cassandra Connector.
spark.cassandra.connection.conf.factory is not a valid Spark Cassandra Connector variable.
No likely matches found.
spark.cassandra.auth.conf.factory is not a valid Spark Cassandra Connector variable.
No likely matches found.
at com.datastax.spark.connector.util.ConfigCheck$.checkConfig(ConfigCheck.scala:38)
at com.datastax.spark.connector.rdd.CassandraRDD.<init>(CassandraRDD.scala:18)
at com.datastax.spark.connector.rdd.CassandraTableScanRDD.<init>(CassandraTableScanRDD.scala:59)
at com.datastax.spark.connector.rdd.CassandraTableScanRDD$.apply(CassandraTableScanRDD.scala:182)
at com.datastax.spark.connector.japi.SparkContextJavaFunctions.cassandraTable(SparkContextJavaFunctions.java:88)
at com.datastax.spark.connector.japi.SparkContextJavaFunctions.cassandraTable(SparkContextJavaFunctions.java:68)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
at py4j.Gateway.invoke(Gateway.java:259)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:207)
at java.lang.Thread.run(Thread.java:745)
Clearly there is some configuration or setup problem. I am not sure how to debug or investigate this sensibly, or what else I could try. Can anyone with more Cassandra/Python/Spark expertise offer some advice? Thanks!
Edit: A colleague had set up a spark-defaults.conf file, and that turned out to be the root cause. I don't fully understand why it caused a problem for Python but not for Java, but it doesn't matter. I don't want that conf file, and removing it resolved the issue.
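For anyone hitting the same error: judging from the two keys the connector rejects in the stack trace above, the offending spark-defaults.conf presumably contained entries along these lines. This is a hypothetical reconstruction, not the actual file contents, and the factory class names are placeholders:

# Hypothetical spark-defaults.conf entries that would trip the connector's ConfigCheck
spark.cassandra.connection.conf.factory   some.custom.ConnectionFactory
spark.cassandra.auth.conf.factory         some.custom.AuthConfFactory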
As I clarified in the OP, it was a colleague's file that had been placed in the Spark directory. Once I cleaned it up, my Python application started working. Thanks! – clay 2015-03-30 22:28:58