Learning Hive Installation as a Clueless Beginner

Hive only needs to be installed on a single node.

1. Upload the tar package to the virtual machine (screenshots of the upload process are omitted here; after uploading, the tarball sits in the working directory on the VM).


2. Extract it:

[root@master hadoop]# tar -zxvf hive-0.12.0.tar.gz

After extracting, the hive-0.12.0 directory sits alongside the tarball.

3. After extracting, start Hive:

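As a minimal sketch of that start-up, assuming the tarball was unpacked under /home/hadoop/hive-0.12.0 and that the hadoop command is already on the PATH (both paths are assumptions, adjust them to your own layout):

cd /home/hadoop/hive-0.12.0
bin/hive        # launches the Hive CLI; the hive> prompt appears once it is ready

With the default configuration Hive uses an embedded Derby metastore, so nothing else has to be configured just to reach the prompt.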

4. Try some SQL:

Well, I had not started the Hadoop cluster yet, so when I typed a MySQL-style create-database statement, Hive simply ignored me and left me waiting.

5. Start my cluster, as shown below. Since the cluster had been shut down after the initial deployment, I will also record the steps for bringing it back up; note that you must not repeat the one-time operations, such as formatting, that were done during deployment:

  Step 1: start the ZooKeeper ensemble (start ZooKeeper on master, slaver1 and slaver2 respectively), as sketched below:

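A minimal sketch of that step, assuming ZooKeeper is installed under /home/hadoop/zookeeper-3.4.5 on every node (the path is an assumption); run it on each of master, slaver1 and slaver2:

cd /home/hadoop/zookeeper-3.4.5/bin
./zkServer.sh start     # start the ZooKeeper server on this node
./zkServer.sh status    # once all three are up, one node reports "Mode: leader" and the others "Mode: follower"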

  Step 2: start the JournalNodes (start one on each of master, slaver1 and slaver2), as sketched below:

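A minimal sketch, assuming HADOOP_HOME points at the Hadoop installation directory (an assumption about your layout); run it on each of master, slaver1 and slaver2:

cd $HADOOP_HOME
sbin/hadoop-daemon.sh start journalnode   # start the JournalNode daemon on this machine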

Run the jps command to verify: master, slaver1 and slaver2 should each now show an extra JournalNode process.

  Step 3: start HDFS (run on slaver3), as sketched below:

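A minimal sketch, with HADOOP_HOME assumed as above; run it only on slaver3:

cd $HADOOP_HOME
sbin/start-dfs.sh    # brings up the NameNodes, DataNodes and, in an HA setup, the ZKFC processes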

  Step 4: start YARN (#####NOTE#####: start-yarn.sh is executed on weekend03; the NameNode and ResourceManager are put on separate machines for performance reasons, since both consume a lot of resources, and because they are separated they have to be started separately on their own machines), as sketched below:

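A minimal sketch, run on the node that is meant to host the ResourceManager (weekend03 in the note above), again assuming HADOOP_HOME is set:

cd $HADOOP_HOME
sbin/start-yarn.sh    # starts the ResourceManager locally and a NodeManager on every host listed in the slaves file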

6. Back to Hive. It still threw an error at first; for a newbie every error is seared into memory, so I am posting it here in case it helps someone else.


The error was as follows:

hive> create database user;
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: org.apache.hadoop.ipc.RemoteException org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create directory /user/hive/warehouse/user.db. Name node is in safe mode.
The reported blocks 0 needs additional 27 blocks to reach the threshold 0.9990 of total blocks 27.
The number of live datanodes 0 has reached the minimum number 0. Safe mode will be turned off automatically once the thresholds have been reached.
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1211)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:3590)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3566)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:754)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:558)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2013)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2009)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1556)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2007)
Caused by: org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create directory /user/hive/warehouse/user.db. Name node is in safe mode.
The reported blocks 0 needs additional 27 blocks to reach the threshold 0.9990 of total blocks 27.
The number of live datanodes 0 has reached the minimum number 0. Safe mode will be turned off automatically once the thresholds have been reached.
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1207)
    ... 13 more
)
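The key lines in that trace are "Name node is in safe mode" and "The number of live datanodes 0": the NameNode is still waiting for block reports that never arrive. I did not run these at the time, but a quick way to confirm the state (assuming the hdfs command is on the PATH) would have been:

hdfs dfsadmin -safemode get   # prints "Safe mode is ON" while the NameNode is waiting for block reports
hdfs dfsadmin -report         # shows how many DataNodes have actually registered as live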

Then I searched Baidu and, at a glance, the likely culprit was the firewall. Turning off the firewall seemed like the right first angle of attack, otherwise this would turn into a much bigger problem, so I disabled the firewall on all seven machines:

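A minimal sketch of disabling the firewall, assuming CentOS 6-style nodes where the firewall is the iptables service (an assumption about the OS); run it on every machine:

service iptables stop      # stop the firewall immediately
chkconfig iptables off     # keep it from starting again after a reboot
service iptables status    # should now report that the firewall is not running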

After that it worked. Deeper study of Hive will come step by step; at least I have gotten through the door now.