How to fix a spark2-submit "No such file or directory" error on CDH Spark2

This article explains how to resolve a "No such file or directory" error thrown by spark2-submit on CDH Spark2. It walks from the log message to the root cause and the fix; hopefully it helps if you run into the same problem.


Reproduction:
On a test CDH Spark2 cluster, submit a Spark Streaming job in YARN cluster mode with the following command:

    spark2-submit \
      --class com.telenav.dataplatform.demo.realtimecases.WeatherAlerts \
      --master yarn --deploy-mode cluster \
      /usr/local/sparkProject/realtimeCases-0.0.1-SNAPSHOT.jar



Error:

    17/03/02 21:01:56 INFO cluster.YarnClusterScheduler: Adding task set 0.0 with 1 tasks
    17/03/02 21:01:56 WARN net.ScriptBasedMapping: Exception running /etc/spark2/conf.cloudera.spark2_on_yarn/yarn-conf/topology.py 172.16.102.64
    java.io.IOException: Cannot run program "/etc/spark2/conf.cloudera.spark2_on_yarn/yarn-conf/topology.py" (in directory "/yarn/nm/usercache/spark/appcache/application_1488459089260_0003/container_1488459089260_0003_01_000001"): error=2, No such file or directory
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:548)
        at org.apache.hadoop.util.Shell.run(Shell.java:504)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:786)
        at org.apache.hadoop.net.ScriptBasedMapping$RawScriptBasedMapping.runResolveCommand(ScriptBasedMapping.java:251)
        at org.apache.hadoop.net.ScriptBasedMapping$RawScriptBasedMapping.resolve(ScriptBasedMapping.java:188)
        at org.apache.hadoop.net.CachedDNSToSwitchMapping.resolve(CachedDNSToSwitchMapping.java:119)
        at org.apache.hadoop.yarn.util.RackResolver.coreResolve(RackResolver.java:101)
        at org.apache.hadoop.yarn.util.RackResolver.resolve(RackResolver.java:81)
        at org.apache.spark.scheduler.cluster.YarnScheduler.getRackForHost(YarnScheduler.scala:37)
        at org.apache.spark.scheduler.TaskSetManager$$anonfun$org$apache$spark$scheduler$TaskSetManager$$addPendingTask$1.apply(TaskSetManager.scala:201)
        at org.apache.spark.scheduler.TaskSetManager$$anonfun$org$apache$spark$scheduler$TaskSetManager$$addPendingTask$1.apply(TaskSetManager.scala:182)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
        at org.apache.spark.scheduler.TaskSetManager.org$apache$spark$scheduler$TaskSetManager$$addPendingTask(TaskSetManager.scala:182)
        at org.apache.spark.scheduler.TaskSetManager$$anonfun$1.apply$mcVI$sp(TaskSetManager.scala:161)
        at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
        at org.apache.spark.scheduler.TaskSetManager.<init>(TaskSetManager.scala:160)
        at org.apache.spark.scheduler.TaskSchedulerImpl.createTaskSetManager(TaskSchedulerImpl.scala:222)
        at org.apache.spark.scheduler.TaskSchedulerImpl.submitTasks(TaskSchedulerImpl.scala:186)
        at org.apache.spark.scheduler.DAGScheduler.submitMissingTasks(DAGScheduler.scala:1058)
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitStage(DAGScheduler.scala:933)
        at org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:873)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1632)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1624)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1613)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
    Caused by: java.io.IOException: error=2, No such file or directory
        at java.lang.UNIXProcess.forkAndExec(Native Method)
        at java.lang.UNIXProcess.<init>(UNIXProcess.java:247)
        at java.lang.ProcessImpl.start(ProcessImpl.java:134)
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)

Troubleshooting:
1. Start from the two key lines:
17/03/02 21:01:56 WARN net.ScriptBasedMapping: Exception running /etc/spark2/conf.cloudera.spark2_on_yarn/yarn-conf/topology.py 172.16.102.64
java.io.IOException: Cannot run program "/etc/spark2/conf.cloudera.spark2_on_yarn/yarn-conf/topology.py" (in directory "/yarn/nm/usercache/spark/appcache/application_1488459089260_0003/container_1488459089260_0003_01_000001"): error=2, No such file or directory
They show that topology.py does not exist on the node at 172.16.102.64. This is Hadoop's rack-awareness (topology) script referenced from the yarn-conf directory; as the stack trace shows (ScriptBasedMapping → RackResolver), the container on that node tries to run it to resolve rack locations and fails because the file was never deployed there.
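One quick way to confirm this diagnosis is to test for the file directly on each node. The small `topo_check` helper below is illustrative, not part of the original write-up; the path comes straight from the error message:

```shell
# Report whether a given script exists and is executable on this host.
topo_check() {
  if [ -x "$1" ]; then
    echo "present"
  else
    echo "MISSING"
  fi
}

# Run this on each NodeManager host; the path is taken from the log line.
topo_check /etc/spark2/conf.cloudera.spark2_on_yarn/yarn-conf/topology.py
```

On the affected node this prints MISSING, while on the host where the Spark2 configuration was deployed it prints present.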

2. Log in to the other nodes and verify the file really is missing there.
3. Then copy the full configuration directory from the 01 host to the other four nodes:
scp -r /etc/spark2/conf.cloudera.spark2_on_yarn root@hadoop-02:/etc/spark2/
scp -r /etc/spark2/conf.cloudera.spark2_on_yarn root@hadoop-03:/etc/spark2/
scp -r /etc/spark2/conf.cloudera.spark2_on_yarn root@hadoop-04:/etc/spark2/
scp -r /etc/spark2/conf.cloudera.spark2_on_yarn root@hadoop-05:/etc/spark2/
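The four scp commands can also be written as a single loop. The host names are the same ones used above; the leading `echo` makes this a dry run that only prints the commands, so you can review them before removing it to copy for real:

```shell
# Distribute the Spark2 config directory from the 01 host to the other nodes.
SRC=/etc/spark2/conf.cloudera.spark2_on_yarn
for h in hadoop-02 hadoop-03 hadoop-04 hadoop-05; do
  echo scp -r "$SRC" "root@$h:/etc/spark2/"  # drop `echo` to actually copy
done
```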

Verification:
Resubmit the job. The ScriptBasedMapping warning no longer appears and the application runs normally.

That is how to resolve the spark2-submit "No such file or directory" problem on CDH Spark2: the rack-topology script referenced by the YARN configuration must exist on every node that can run containers. If you hit a similar error, the same analysis should apply.

