Cannot write file to HDFS - getting error that HDFS is in safe mode
When I try to copy a file from my local directory to HDFS, I get the following error:
[[email protected] ~]$ hadoop fs -copyFromLocal hello.txt /user/cloudera/my_data
copyFromLocal: Cannot create file/user/cloudera/my_data/hello.txt._COPYING_. Name node is in safe mode.
I then ran these commands:
[[email protected] ~]$ su
Password:
[[email protected] cloudera]# hdfs dfsadmin -safemode leave
safemode: Access denied for user root. Superuser privilege is required
Then I ran the command to store the file in HDFS again and got the same error. So I tried once more:
[[email protected] ~]$ su - root
Password:
[[email protected] ~]# hdfs dfsadmin -safemode leave
I got the same error again. I am using the Cloudera distribution of Hadoop.
The namenode sometimes stays in safe mode after a restart; if you wait a while (how long depends on the number of blocks), it will leave safe mode automatically.
You can force it out with the hdfs dfsadmin -safemode leave
command. Only the HDFS superuser can run this command, so switch to the hdfs user before running it:
su hdfs
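A minimal sketch of the whole sequence, assuming a Cloudera QuickStart-style setup where the HDFS superuser account is named hdfs (check your own cluster's configuration if it differs):

```shell
# Check whether the namenode is currently in safe mode
sudo -u hdfs hdfs dfsadmin -safemode get

# Force the namenode out of safe mode (superuser privilege required,
# hence running as the hdfs user rather than root)
sudo -u hdfs hdfs dfsadmin -safemode leave
```

Using sudo -u hdfs avoids the "Access denied for user root" error in the question: in HDFS, root is an ordinary user, and superuser status belongs to the account that runs the namenode process.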
From the Apache documentation here:
During start up the NameNode loads the file system state from the fsimage and the edits log file. It then waits for DataNodes to report their blocks so that it does not prematurely start replicating the blocks though enough replicas already exist in the cluster. During this time NameNode stays in Safemode. Safemode for the NameNode is essentially a read-only mode for the HDFS cluster, where it does not allow any modifications to file system or blocks. Normally the NameNode leaves Safemode automatically after the DataNodes have reported that most file system blocks are available. If required, HDFS could be placed in Safemode explicitly using bin/hadoop dfsadmin -safemode command.
In most cases this happens within a reasonable amount of time after HDFS is started. However, you can force HDFS out of safe mode with the following command:
hadoop dfsadmin -safemode leave
It is strongly recommended to run fsck to recover from the inconsistent state.
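A sketch of such an fsck run, again assuming the HDFS superuser is named hdfs; the options shown are standard hdfs fsck flags for listing files and their blocks:

```shell
# Check the whole filesystem for missing, corrupt, or under-replicated
# blocks after forcing the namenode out of safe mode
sudo -u hdfs hdfs fsck / -files -blocks
```

The summary at the end of the report tells you whether the filesystem is HEALTHY or CORRUPT, which is what you want to verify before trusting writes again.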
Thanks Maximillian for correcting the formatting. – user1574688