Running Spark's bundled SparkPi example from IDEA fails with the error below. Could someone take a look? Thanks.
The log is as follows; I can't tell which setting is wrong.
15/11/23 22:29:39 INFO executor.CoarseGrainedExecutorBackend: Registered signal handlers for [TERM, HUP, INT]
15/11/23 22:29:41 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/11/23 22:29:41 INFO spark.SecurityManager: Changing view acls to: spark,zcy
15/11/23 22:29:41 INFO spark.SecurityManager: Changing modify acls to: spark,zcy
15/11/23 22:29:41 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(spark, zcy); users with modify permissions: Set(spark, zcy)
15/11/23 22:29:42 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/11/23 22:29:42 INFO Remoting: Starting remoting
15/11/23 22:29:42 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://driverPropsFetcher@SparkWorker203:42203]
15/11/23 22:29:42 INFO util.Utils: Successfully started service 'driverPropsFetcher' on port 42203.
15/11/23 22:29:43 WARN Remoting: Tried to associate with unreachable remote address [akka.tcp://sparkDriver@DESKTOP-5HOMAO8:7265]. Address is now gated for 5000 ms, all messages to this address will be delivered to dead letters.
Reason: Connection refused: DESKTOP-5HOMAO8/192.168.40.111:7265
Exception in thread "main" java.lang.reflect.UndeclaredThrowableException
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1643)
	at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:59)
	at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:128)
	at org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:224)
	at org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
Caused by: java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
	at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
	at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
	at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
	at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
	at scala.concurrent.Await$.result(package.scala:107)
	at org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$run$1.apply$mcV$sp(CoarseGrainedExecutorBackend.scala:144)
	at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:60)
	at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:59)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
	... 4 more
Try submitting the job with spark-submit instead.
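For reference, a typical spark-submit invocation for the bundled SparkPi example might look like the sketch below. The master URL, jar path, and driver IP are assumptions for illustration; substitute your own cluster's values (the `spark.driver.host` override matters when the submitting machine has more than one network interface).

```shell
# Hypothetical sketch: run SparkPi against a standalone cluster.
# --master, the jar path, and the driver IP are placeholders, not values
# taken from this thread's setup.
spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://192.168.40.202:7077 \
  --conf spark.driver.host=192.168.40.111 \
  $SPARK_HOME/lib/spark-examples-1.5.2-hadoop2.6.0.jar \
  100
```

Setting `spark.driver.host` explicitly pins the address the executors use to connect back to the driver, which sidesteps hostname resolution picking the wrong interface.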
It's probably that your local machine and the Spark cluster can't reach each other.
https://github.com/jacksu/utils4s
Solved. The Spark driver's IP was mapped incorrectly: the hostname resolved to the address of a different network card. Thanks to you both for following up.

Could you explain in more detail what you mean by "the Spark driver's IP was mapped incorrectly"? Please let me know: 601983106@qq.com
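To make the resolution above concrete: when the driver host has multiple NICs, you can pin the driver to the interface the workers can actually reach by setting `spark.driver.host` in the SparkConf before the SparkContext is created. This is a minimal sketch for the Spark 1.x API used in this thread; the master URL and IP address are assumptions, not values confirmed by the original poster.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object SparkPiDriver {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("SparkPi")
      // Assumption: standalone master URL of your cluster.
      .setMaster("spark://192.168.40.202:7077")
      // Pin the driver to the NIC that is routable from the workers,
      // so executors do not try to connect back via the wrong interface.
      .set("spark.driver.host", "192.168.40.111")

    val sc = new SparkContext(conf)
    // ... run the SparkPi computation here ...
    sc.stop()
  }
}
```

Equivalently, check that the hostname (here `DESKTOP-5HOMAO8`) in your hosts file maps to the IP of the NIC that is on the same network as the cluster, not to another adapter's address.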