I am getting the following error when I submit my application on AWS EMR. Submitting the Spark application in client mode works fine. If any additional configuration is required to make it work in cluster mode on AWS EMR, please let me know.
[hadoop@ip-172-31-81-182 ~]$ spark-submit --master yarn --deploy-mode cluster --executor-memory 1G --num-executors 1 --driver-memory 1g --executor-cores 1 --conf spark.yarn.submit.waitAppCompletion=false --class WordCount.word.App /home/hadoop/word.jar s3n://bucket1/text.txt s3n://bucket1/output/ s3n://bucket1/analysis1/user.parquet
18/12/13 11:26:06 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/12/13 11:26:07 INFO RMProxy: Connecting to ResourceManager at ip-172-31-81-182.ec2.internal/172.31.81.182:8032
18/12/13 11:26:07 INFO Client: Requesting a new application from cluster with 1 NodeManagers
18/12/13 11:26:07 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (6144 MB per container)
18/12/13 11:26:07 INFO Client: Will allocate AM container, with 1408 MB memory including 384 MB overhead
18/12/13 11:26:07 INFO Client: Setting up container launch context for our AM
18/12/13 11:26:07 INFO Client: Setting up the launch environment for our AM container
18/12/13 11:26:07 INFO Client: Preparing resources for our AM container
18/12/13 11:26:09 WARN Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
18/12/13 11:26:11 INFO Client: Uploading resource file:/mnt/tmp/spark-e01877e4-eb8c-4f6a-a6f5-c5c769c9c21e/__spark_libs__2272513134347036396.zip -> hdfs://ip-172-31-81-182.ec2.internal:8020/user/hadoop/.sparkStaging/application_1544697633631_0011/__spark_libs__2272513134347036396.zip
18/12/13 11:26:13 INFO Client: Uploading resource file:/home/hadoop/word.jar -> hdfs://ip-172-31-81-182.ec2.internal:8020/user/hadoop/.sparkStaging/application_1544697633631_0011/word.jar
18/12/13 11:26:15 INFO Client: Uploading resource file:/mnt/tmp/spark-e01877e4-eb8c-4f6a-a6f5-c5c769c9c21e/__spark_conf__8515846431603225843.zip -> hdfs://ip-172-31-81-182.ec2.internal:8020/user/hadoop/.sparkStaging/application_1544697633631_0011/__spark_conf__.zip
18/12/13 11:26:15 INFO SecurityManager: Changing view acls to: hadoop
18/12/13 11:26:15 INFO SecurityManager: Changing modify acls to: hadoop
18/12/13 11:26:15 INFO SecurityManager: Changing view acls groups to:
18/12/13 11:26:15 INFO SecurityManager: Changing modify acls groups to:
18/12/13 11:26:15 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); groups with view permissions: Set(); users with modify permissions: Set(hadoop); groups with modify permissions: Set()
18/12/13 11:26:15 INFO Client: Submitting application application_1544697633631_0011 to ResourceManager
18/12/13 11:26:16 INFO YarnClientImpl: Submitted application application_1544697633631_0011
18/12/13 11:26:16 INFO Client: Application report for application_1544697633631_0011 (state: ACCEPTED)
18/12/13 11:26:16 INFO Client:
client token: N/A
diagnostics: [Thu Dec 13 11:26:16 +0000 2018] Application is Activated, waiting for resources to be assigned for AM. Details : AM Partition = <DEFAULT_PARTITION> ; Partition Resource = <memory:6144, vCores:4> ; Queue's Absolute capacity = 100.0 % ; Queue's Absolute used capacity = 0.0 % ; Queue's Absolute max capacity = 100.0 % ;
ApplicationMaster host: N/A
ApplicationMaster RPC port: -1
queue: default
start time: 1544700376013
final status: UNDEFINED
tracking URL: http://ip-172-31-81-182.ec2.internal:20888/proxy/application_1544697633631_0011/
user: hadoop
18/12/13 11:26:16 INFO ShutdownHookManager: Shutdown hook called
18/12/13 11:26:16 INFO ShutdownHookManager: Deleting directory /mnt/tmp/spark-e01877e4-eb8c-4f6a-a6f5-c5c769c9c21e
18/12/13 11:26:16 INFO ShutdownHookManager: Deleting directory /mnt/tmp/spark-0762aaf6-577a-4ad7-a4a1-c4c16a590feb
[hadoop@ip-172-31-81-182 ~]$
Note that the job was submitted with spark.yarn.submit.waitAppCompletion=false, so the client returns as soon as the application is ACCEPTED; any failure happens later on the cluster and does not appear in the output above. To see the actual log messages for a Spark job run in YARN cluster mode, use the following command:
yarn logs -applicationId <application_id>
For the run above, the application id would be application_1544697633631_0011.
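Putting the two together, a concrete invocation for this run would look like the line below (assuming the application has finished and YARN log aggregation is enabled on the EMR cluster; otherwise the aggregated logs may not be available yet):
yarn logs -applicationId application_1544697633631_0011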