I have a Docker setup running on an EC2 server, and spark-submit fails when the jar is on S3:
docker exec -it master bin/spark-submit --master spark://0.0.0.0:7077 --verbose --class my/class s3://myBucket/path
Here is the output from the run:
Warning: Skip remote jar s3://myBucket/MyBin.
java.lang.ClassNotFoundException: my/class
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:228)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:693)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
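One thing that stands out in the command above is `--class my/class`: `--class` expects a fully-qualified binary class name with dots (e.g. `com.example.MyClass`), and the slash form is exactly what `Class.forName` rejects. A minimal JVM sketch illustrating the difference (`java.lang.String` stands in here for the real application class):

```java
// Minimal sketch: Class.forName() accepts dotted binary names only.
public class ClassNameDemo {
    public static void main(String[] args) {
        try {
            // The dotted binary name resolves normally.
            System.out.println(Class.forName("java.lang.String").getName());
            // The slash form is the internal JVM descriptor syntax and is
            // rejected, producing the same ClassNotFoundException as the log.
            Class.forName("java/lang/String");
        } catch (ClassNotFoundException e) {
            System.out.println("ClassNotFoundException: " + e.getMessage());
        }
    }
}
```

Fixing the class name may not resolve the "Skip remote jar" warning, but it addresses the `ClassNotFoundException` shown in the trace.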
A few things to check: can you verify that the jar is actually being downloaded? If not, as a temporary measure, could you make it publicly accessible, just to see whether there is a permissions/network issue? – ImDarrenG
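One way to run the check ImDarrenG suggests from the EC2 host (assuming the AWS CLI is installed; the object key mirrors the warning in the log and may need adjusting):

```shell
# Try to fetch the jar with the same credentials the host uses.
# Success means the object is reachable; an AccessDenied/403 error
# points at an S3 permissions problem rather than a Spark one.
aws s3 cp s3://myBucket/MyBin /tmp/MyBin.jar

# Sanity-check that the application class is actually inside the jar
# (look for the expected package path among the .class entries).
unzip -l /tmp/MyBin.jar | grep -i '\.class'
```

If the copy succeeds but spark-submit still prints "Skip remote jar", the problem is on the Spark side rather than with S3 access.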