Call From master.hadoop/192.168.31.149 to master.hadoop:8020 failed on connection exception

Summary: A mistake Hadoop beginners often run into: Call From master.hadoop/192.168.31.149 to master.hadoop:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused

This error occurs because the Hadoop cluster has not been started.
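Before restarting anything, a quick check confirms the diagnosis. The sketch below is a minimal example, assuming the NameNode is expected to listen on master.hadoop:8020 (as in the error message) and that netcat (nc) is available; jps ships with the JDK and lists the running Hadoop daemons.

# List running Hadoop Java daemons; NameNode should appear once the cluster is up
jps

# Probe the NameNode RPC port from the client machine (assumes nc is installed)
nc -zv master.hadoop 8020    # "Connection refused" here matches the error above

If jps shows no NameNode and the port probe is refused, the cluster simply is not running and needs to be started.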

Commands for starting and stopping the Hadoop cluster (a complete startup example follows the list):

start-all.sh    Starts all Hadoop daemons: NameNode, SecondaryNameNode, DataNode, JobTracker, and TaskTracker
stop-all.sh    Stops all Hadoop daemons: NameNode, SecondaryNameNode, DataNode, JobTracker, and TaskTracker
start-dfs.sh    Starts the Hadoop HDFS daemons: NameNode, SecondaryNameNode, and DataNode
stop-dfs.sh    Stops the Hadoop HDFS daemons: NameNode, SecondaryNameNode, and DataNode
hadoop-daemons.sh start namenode    Starts only the NameNode daemon
hadoop-daemons.sh stop namenode    Stops only the NameNode daemon
hadoop-daemons.sh start datanode    Starts only the DataNode daemon
hadoop-daemons.sh stop datanode    Stops only the DataNode daemon
hadoop-daemons.sh start secondarynamenode    Starts only the SecondaryNameNode daemon
hadoop-daemons.sh stop secondarynamenode    Stops only the SecondaryNameNode daemon
start-mapred.sh    Starts the Hadoop MapReduce daemons: JobTracker and TaskTracker
stop-mapred.sh    Stops the Hadoop MapReduce daemons: JobTracker and TaskTracker
hadoop-daemons.sh start jobtracker    Starts only the JobTracker daemon
hadoop-daemons.sh stop jobtracker    Stops only the JobTracker daemon
hadoop-daemons.sh start tasktracker    Starts only the TaskTracker daemon
hadoop-daemons.sh stop tasktracker    Stops only the TaskTracker daemon
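As a concrete example, the sketch below starts HDFS and verifies that the NameNode is reachable before retrying the failed operation. It assumes the Hadoop sbin scripts are on the PATH and that fs.defaultFS in core-site.xml points at hdfs://master.hadoop:8020, matching the error message above.

# Start the HDFS daemons (NameNode, SecondaryNameNode, DataNode)
start-dfs.sh

# Confirm the daemons came up
jps

# Verify the NameNode RPC port is now listening
nc -zv master.hadoop 8020

# Retry a simple HDFS command; it should no longer fail with "Connection refused"
hdfs dfs -ls /

If the daemons start but the connection is still refused, check that fs.defaultFS uses the hostname clients actually resolve, and that /etc/hosts maps master.hadoop to 192.168.31.149 rather than to 127.0.0.1, as described on the ConnectionRefused wiki page linked above.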

