2017-09-14 78 views
1

dsetool status datastax - unable to connect to DSE resource manager on spark-submit

DC: dc1  Workload: Cassandra  Graph: no
======================================================
Status=Up/Down
|/ State=Normal/Leaving/Joining/Moving
--  Address        Load        Owns  VNodes  Rack  Health [0,1]
UN  192.168.1.130  810.47 MiB  ?     256     2a    0.90
UN  192.168.1.131  683.53 MiB  ?     256     2a    0.90
UN  192.168.1.132  821.33 MiB  ?     256     2a    0.90

DC: dc2  Workload: Analytics  Graph: no  Analytics Master: 192.168.2.131
=========================================================================================
Status=Up/Down
|/ State=Normal/Leaving/Joining/Moving
--  Address        Load        Owns  VNodes  Rack  Health [0,1]
UN  192.168.2.130  667.05 MiB  ?     256     2a    0.90
UN  192.168.2.131  845.48 MiB  ?     256     2a    0.90
UN  192.168.2.132  887.92 MiB  ?     256     2a    0.90

When I try to launch a spark-submit job:

dse -u user -p password spark-submit --class com.sparkLauncher test.jar prf 

I get the following errors (edited):

ERROR 2017-09-14 20:14:14,174 org.apache.spark.deploy.rm.DseAppClient$ClientEndpoint: Failed to connect to DSE resource manager 
java.io.IOException: Failed to register with master: dse://? 

....

Caused by: com.datastax.driver.core.exceptions.InvalidQueryException: The method DseResourceManager.registerApplication does not exist. Make sure that the required component for that method is active/enabled 

....

ERROR 2017-09-14 20:14:14,179 org.apache.spark.deploy.rm.DseSchedulerBackend: Application has been killed. Reason: Failed to connect to DSE resource manager: Failed to register with master: dse://? 
org.apache.spark.SparkException: Exiting due to error from cluster scheduler: Failed to connect to DSE resource manager: Failed to register with master: dse://? 

....

WARN 2017-09-14 20:14:14,179 org.apache.spark.deploy.rm.DseSchedulerBackend: Application ID is not initialized yet. 
ERROR 2017-09-14 20:14:14,384 org.apache.spark.SparkContext: Error initializing SparkContext. 
java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem 

ERROR 2017-09-14 20:14:14,387 org.apache.spark.deploy.DseSparkSubmitBootstrapper: Failed to start or submit Spark application 
java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem 

I can confirm that I have granted the permissions mentioned in this document: https://docs.datastax.com/en/dse/5.1/dse-admin/datastax_enterprise/security/secAuthSpark.html. I tried this on AWS, in case that makes a difference, and I can confirm that routing between the nodes is fully open. I can launch the Spark shell from any Spark node, can bring up the Spark UI, and can get the Spark master from a cqlsh command.

Any pointers would be helpful, thanks in advance!

Answers

0

For a reason I have not been able to pin down, I can run it in cluster mode but not in client mode.

1

The master address must point to one or more nodes in a valid Analytics-enabled datacenter.

Caused by: com.datastax.driver.core.exceptions.InvalidQueryException: 
The method DseResourceManager.registerApplication does not exist. 
Make sure that the required component for that method is active/enabled 

indicates that the node being connected to does not have Analytics enabled.

If you are running from a non-Analytics node, the master URI must still point at one of the Analytics nodes:

dse://[Spark node address[:port number]]?[parameter name=parameter value;]... 

By default, the dse://? URL connects to localhost for its initial cluster connection.
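Following that URL scheme, the submit command from the question could be retargeted at an Analytics node explicitly. This is a sketch: it reuses the credentials and jar from the question and assumes one of the dc2 addresses shown in the `dsetool status` output above is reachable from the submitting host.

```shell
# Point the master URL at an Analytics (dc2) node instead of relying on
# the default localhost connection. The ? separates the host part from
# optional parameters; the trailing ? with no parameters is valid.
dse -u user -p password spark-submit \
  --master "dse://192.168.2.131?" \
  --class com.sparkLauncher test.jar prf
```

Several comma-separated addresses can be listed before the `?` if you want the initial connection to fail over between Analytics nodes.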

See the documentation for more information.

+0

@DataStax! I am running spark-submit from the master node, is that what you are referring to? – avinash

+0

You can run it from any Analytics-enabled node. If you are still getting that message, it means the Analytics module is not running. I would check the system logs. – RussS
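A quick way to perform those checks, sketched below. This assumes a DSE 5.1 package install (the log path `/var/log/cassandra/system.log` is the package-install default and may differ on your deployment):

```shell
# Confirm the datacenter you are submitting from reports an Analytics workload
dsetool status | grep "Workload:"

# Ask DSE for the current Spark master address; this fails if the
# Analytics component is not actually running on the contacted node
dse client-tool spark master-address

# Scan the DSE system log for recent Spark/Analytics startup errors
grep -i "error" /var/log/cassandra/system.log | tail -20
```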
