
How to fix the "ERROR: Attempting to operate on hdfs ... as root" error when running Hadoop's start-dfs.sh

Error messages

Starting namenodes on [master]
ERROR: Attempting to operate on hdfs namenode as root
ERROR: but there is no HDFS_NAMENODE_USER defined. Aborting operation.

Starting datanodes
ERROR: Attempting to operate on hdfs datanode as root
ERROR: but there is no HDFS_DATANODE_USER defined. Aborting operation.

Starting secondary namenodes
ERROR: Attempting to operate on hdfs secondarynamenode as root
ERROR: but there is no HDFS_SECONDARYNAMENODE_USER defined. Aborting operation.

Starting journal nodes
ERROR: Attempting to operate on hdfs journalnode as root
ERROR: but there is no HDFS_JOURNALNODE_USER defined. Aborting operation.

Starting ZK Failover Controllers on NN hosts
ERROR: Attempting to operate on hdfs zkfc as root
ERROR: but there is no HDFS_ZKFC_USER defined. Aborting operation.

 

Cause:

The services are being started as the root account, but the HDFS_*_USER variables that authorize this were not defined beforehand.

 

Solution

* These steps must be performed on every node. Alternatively, make the changes on one node first and then use scp to sync them to the others.

1. Edit start-dfs.sh and stop-dfs.sh

cd /home/hadoop/sbin
vim start-dfs.sh
vim stop-dfs.sh

Add the following at the top of each file:

HDFS_ZKFC_USER=root
HDFS_JOURNALNODE_USER=root
HDFS_NAMENODE_USER=root
HDFS_SECONDARYNAMENODE_USER=root
HDFS_DATANODE_USER=root
HDFS_DATANODE_SECURE_USER=root
#HADOOP_SECURE_DN_USER=root

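To avoid opening each file in vim, the same insertion can be scripted. Below is a sketch with a hypothetical helper (`add_hdfs_users` is not part of Hadoop) that inserts the user definitions right after the shebang line; it assumes GNU sed for the `-i` in-place option.

```shell
#!/bin/sh
# Hypothetical helper: insert the HDFS_*_USER definitions immediately
# after the shebang of a start/stop script (requires GNU sed).
add_hdfs_users() {
  # $1 = path to start-dfs.sh or stop-dfs.sh
  sed -i '1a\
HDFS_ZKFC_USER=root\
HDFS_JOURNALNODE_USER=root\
HDFS_NAMENODE_USER=root\
HDFS_SECONDARYNAMENODE_USER=root\
HDFS_DATANODE_USER=root\
HDFS_DATANODE_SECURE_USER=root' "$1"
}

# Usage (adjust the path to your installation):
# add_hdfs_users /home/hadoop/sbin/start-dfs.sh
# add_hdfs_users /home/hadoop/sbin/stop-dfs.sh
```

Inserting after line 1 rather than prepending keeps the `#!/usr/bin/env bash` shebang as the first line of the script.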
 

2. Edit start-yarn.sh and stop-yarn.sh

cd /home/hadoop/sbin
vim start-yarn.sh
vim stop-yarn.sh

Add the following at the top of each file:

#HADOOP_SECURE_DN_USER=root
HDFS_DATANODE_SECURE_USER=root
YARN_NODEMANAGER_USER=root
YARN_RESOURCEMANAGER_USER=root
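Note: as an alternative to patching the sbin scripts on every node, Hadoop 3.x also reads these variables from etc/hadoop/hadoop-env.sh. A configuration sketch (path relative to your Hadoop installation):

```shell
# etc/hadoop/hadoop-env.sh -- same effect as editing the sbin scripts;
# picked up by start-dfs.sh/stop-dfs.sh and start-yarn.sh/stop-yarn.sh.
export HDFS_NAMENODE_USER=root
export HDFS_DATANODE_USER=root
export HDFS_SECONDARYNAMENODE_USER=root
export HDFS_JOURNALNODE_USER=root
export HDFS_ZKFC_USER=root
export YARN_RESOURCEMANAGER_USER=root
export YARN_NODEMANAGER_USER=root
```

This keeps the stock scripts untouched, which makes future Hadoop upgrades cleaner.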

 

3. Sync to the other nodes

cd /home/hadoop/sbin
scp * c2:/home/hadoop/sbin
scp * c3:/home/hadoop/sbin
scp * c4:/home/hadoop/sbin
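The three scp commands above can also be wrapped in a loop. A sketch: the host names c2–c4 and the sbin path come from this article, and the `sync_scripts` helper is hypothetical; passing `"echo scp"` previews the commands without copying anything.

```shell
#!/bin/sh
# Hypothetical helper: copy the four edited scripts to each worker node.
sync_scripts() {
  # $1 = command to run ("scp" normally, "echo scp" for a dry run)
  cmd=$1
  for host in c2 c3 c4; do
    $cmd /home/hadoop/sbin/start-dfs.sh /home/hadoop/sbin/stop-dfs.sh \
         /home/hadoop/sbin/start-yarn.sh /home/hadoop/sbin/stop-yarn.sh \
         "$host:/home/hadoop/sbin/"
  done
}

# Preview the commands that would run:
sync_scripts "echo scp"
```

Listing the four files explicitly, instead of `scp *`, avoids accidentally overwriting other sbin scripts on the remote nodes.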

 

