
I installed DC/OS version 1.9.2 on OpenStack. I am trying to install Apache Spark on DC/OS, but the installation does not succeed.

dcos package install spark 
Installing Marathon app for package [spark] version [1.1.0-2.1.1] 
Installing CLI subcommand for package [spark] version [1.1.0-2.1.1] 
New command available: dcos spark 
DC/OS Spark is being installed! 
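
For what it's worth, the deployment can also be watched from the CLI instead of the dashboard. A minimal sketch, assuming the stock DC/OS 1.9 CLI and the default app id /spark:

dcos marathon app list    # /spark should eventually report its task as running and healthy
dcos service              # the spark framework should register with Mesos once the dispatcher is up
dcos task log spark       # tail the dispatcher task's stdout/stderr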

However, the DC/OS dashboard shows Spark stuck in the deploying state and the task never runs. The error log shows this message:

I0728 16:43:36.348244 14038 exec.cpp:162] Version: 1.2.2 
I0728 16:43:36.656839 14046 exec.cpp:237] Executor registered on agent abf187f4-ad7d-4ead-9437-5cdba4f77bdc-S1 
+ export DISPATCHER_PORT=24238 
+ DISPATCHER_PORT=24238 
+ export DISPATCHER_UI_PORT=24239 
+ DISPATCHER_UI_PORT=24239 
+ export SPARK_PROXY_PORT=24240 
+ SPARK_PROXY_PORT=24240 
+ SCHEME=http 
+ OTHER_SCHEME=https 
+ [[ '' == true ]] 
+ export DISPATCHER_UI_WEB_PROXY_BASE=/service/spark 
+ DISPATCHER_UI_WEB_PROXY_BASE=/service/spark 
+ grep -v '#https#' /etc/nginx/conf.d/spark.conf.template 
+ sed s,#http#,, 
+ sed -i 's,<PORT>,24240,' /etc/nginx/conf.d/spark.conf 
+ sed -i 's,<DISPATCHER_URL>,http://172.16.129.180:24238,' /etc/nginx/conf.d/spark.conf 
+ sed -i 's,<DISPATCHER_UI_URL>,http://172.16.129.180:24239,' /etc/nginx/conf.d/spark.conf 
+ sed -i 's,<PROTOCOL>,,' /etc/nginx/conf.d/spark.conf 
+ [[ '' == true ]] 
+ [[ -f hdfs-site.xml ]] 
+ [[ -n '' ]] 
+ exec runsvdir -P /etc/service 
+ mkdir -p /mnt/mesos/sandbox/nginx 
+ mkdir -p /mnt/mesos/sandbox/spark 

+ exec 
+ exec svlogd /mnt/mesos/sandbox/nginx 
+ exec svlogd /mnt/mesos/sandbox/spark 
nginx: [emerg] could not build the types_hash, you should increase either types_hash_max_size: 1024 or types_hash_bucket_size: 32 
nginx: [emerg] could not build the types_hash, you should increase either types_hash_max_size: 1024 or types_hash_bucket_size: 32 
nginx: [emerg] could not build the types_hash, you should increase either types_hash_max_size: 1024 or types_hash_bucket_size: 32 
nginx: [emerg] could not build the types_hash, you should increase either types_hash_max_size: 1024 or types_hash_bucket_size: 32 
nginx: [emerg] could not build the types_hash, you should increase either types_hash_max_size: 1024 or types_hash_bucket_size: 32 

How can I get the Spark task running on DC/OS? Thanks.


I logged into the Spark Docker container and changed types_hash_bucket_size to 64. The deployment then finished, but this workaround is not a clean process.
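
For anyone trying to reproduce that workaround, here is a rough sketch of raising the nginx hash limits inside the running dispatcher container. The container name and the config path are assumptions, not taken from the actual image, so adjust them to what docker ps and the image layout show:

docker exec <spark-dispatcher-container> \
  sed -i 's/http {/http {\n    types_hash_max_size 2048;\n    types_hash_bucket_size 64;/' /etc/nginx/nginx.conf
# nginx is supervised by runsv here (hence the repeated [emerg] lines), so the
# next automatic restart attempt should pick up the new directives.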


I checked the DC/OS logs. I guess the deployment times out because NGINX inside the Docker container fails to start. DC/OS checks Spark's health, NGINX does not respond, so DC/OS decides the Spark task is not alive and kills it. Spark then tries to deploy again, and the loop never ends.
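
One way to confirm that restart loop, sketched here under the assumption that the Marathon app id is the default /spark, is to inspect the app's health check settings and the deployment queue:

dcos marathon app show /spark     # look at healthChecks and gracePeriodSeconds
dcos marathon deployment list     # a deployment that never completes points at failing health checks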

Answer


Check whether you uninstalled the previous Spark installation correctly. You have to delete the old ZooKeeper entries for Spark (via /exhibitor).
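
A rough sketch of that cleanup, assuming the package was installed under its default name. The janitor arguments (role, principal, ZooKeeper node) are the values the old DC/OS docs used for Spark and may differ on your cluster, so verify them in Exhibitor first:

dcos package uninstall spark
# Remove leftover framework state; alternatively delete the Spark znode by hand
# through the Exhibitor UI at http://<master>/exhibitor, as suggested above.
dcos node ssh --master-proxy --leader \
  "docker run mesosphere/janitor /janitor.py -r spark-role -p spark-principal -z spark_mesos_dispatcher"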

Also check that no zombie frameworks are holding the resources that the new deployment needs. Kill them with:

curl -X POST http://MESOSMASTER_URL:5050/master/teardown -d 'frameworkId=<FRAMEWORKID>' 
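
To find the framework id of a lingering Spark framework, the Mesos master's state endpoint can be queried first; the jq filter below is only an illustration:

curl -s http://MESOSMASTER_URL:5050/master/state \
  | jq '.frameworks[] | select(.name | test("spark"; "i")) | {id, name, active}'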

I tried uninstalling Spark and deleted the old ZooKeeper Spark entries. I also checked for zombie frameworks but could not find any.