Installing Hive on Linux

Prerequisites

MySQL is already installed.

1. Download the Hive package

Download address: http://mirror.bit.edu.cn/apache/hive/

2. Extract into the install directory

tar -xzvf <hive-tarball>.tar.gz    # then move/rename the result into the install directory, e.g. /home/ocetl/app/hive

3. Configure environment variables

[[email protected] ~]$ vi .bash_profile 

Append the following:

#hive
export HIVE_HOME=/home/ocetl/app/hive
export PATH=${HIVE_HOME}/bin:${HIVE_HOME}/sbin:$PATH
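As a quick sanity check, the two export lines can be verified from the current shell. A minimal sketch (the path is this walkthrough's install prefix; adjust to yours):

```shell
# Same exports as in .bash_profile above.
export HIVE_HOME=/home/ocetl/app/hive
export PATH=${HIVE_HOME}/bin:${HIVE_HOME}/sbin:$PATH
# After editing .bash_profile, apply it to the current shell with:
#   source ~/.bash_profile
# Confirm the Hive bin directory is actually on PATH:
echo "$PATH" | tr ':' '\n' | grep '^/home/ocetl/app/hive/bin$'
```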

4. Create the MySQL database and its user

mysql -uroot -p
# switch to the mysql database and create the user
use mysql;
create user 'hive'@'%' identified by 'hive';
# grant privileges and flush
GRANT ALL PRIVILEGES ON *.* TO 'hive'@'%' identified by 'hive' WITH GRANT OPTION;
GRANT ALL PRIVILEGES ON *.* TO 'hive'@localhost identified by 'hive' WITH GRANT OPTION;
flush privileges; 
# log in to MySQL as the newly created hive user and create the hive database
mysql -uhive -phive
create database if not exists hive;
alter database hive character set latin1;
# The hive database character set must be latin1, otherwise schema creation fails with:
# Specified key was too long; max key length is 767 bytes
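The reason latin1 avoids this error is simple byte arithmetic: InnoDB caps a single index key at 767 bytes, and latin1 stores 1 byte per character while utf8 needs up to 3. A small sketch of the limit per charset:

```shell
# InnoDB single-index-key limit (bytes), as in the error message above.
MAX_KEY_BYTES=767
# Longest indexable VARCHAR in characters, by charset byte width:
echo "latin1 (1 byte/char): $((MAX_KEY_BYTES / 1)) chars"
echo "utf8   (3 bytes/char): $((MAX_KEY_BYTES / 3)) chars"
```

With utf8 an indexed column longer than 255 characters already exceeds 767 bytes, which is what breaks the metastore schema scripts.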

5. Configure hive-site.xml

Adjust any of the following properties that do not match your environment:
<!-- Hive data directory on HDFS; create it manually on HDFS after Hadoop is up -->
<property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/hive/data</value>
</property>
<!-- JDBC URL for the hive database in MySQL -->
<property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://192.168.*.*:3306/hive?createDatabaseIfNotExist=true</value>
</property>
<!-- MySQL JDBC driver -->
<property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
</property>
<!-- MySQL username (must match the user created in step 4) -->
<property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive1</value>
</property>
<!-- MySQL password -->
<property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hive1</value>
</property>
<!-- ZooKeeper quorum nodes -->
<property>
    <name>hive.zookeeper.quorum</name>
    <value>einvoice243,einvoice244,einvoice247</value>
</property>
<!-- ZooKeeper client port -->
<property>
    <name>hive.zookeeper.client.port</name>
    <value>21810</value>
</property>
<!-- war file for the Hive web UI (HWI) -->
<property>  
    <name>hive.hwi.war.file</name> 
    <value>lib/hive-hwi-0.13.1.war</value> 
</property>
<!-- HWI listen port -->
<property>
    <name>hive.hwi.listen.port</name>
    <value>29999</value>
</property>
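A quick way to double-check what Hive will actually read is to pull a <value> out of the file with sed. This sketch runs against an inlined fragment mirroring the driver property above; on a real host, point the sed at $HIVE_CONF_DIR/hive-site.xml instead:

```shell
# Write a hive-site.xml-style fragment to check against (illustrative).
cat > /tmp/hive-site-check.xml <<'EOF'
<property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
</property>
EOF
# Extract the text between <value> and </value>.
driver=$(sed -n 's:.*<value>\(.*\)</value>.*:\1:p' /tmp/hive-site-check.xml)
echo "driver = $driver"
```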

6. Configure hive-env.sh

export HADOOP_HEAPSIZE=1024
export JAVA_HOME=/opt/java/jdk1.8.0_73
export HADOOP_HOME=/home/ocetl/app/hadoop-2.6.0-cdh5.9.3
export HIVE_HOME=/home/ocetl/app/hive
export HIVE_CONF_DIR=/home/ocetl/app/hive/conf
export HIVE_AUX_JARS_PATH=/home/ocetl/app/hive/lib

7. Configure hive-log4j.properties

vi hive-log4j.properties 
# change the log directory
hive.log.dir=/home/ocetl/hive-0.13.1-cdh5.2.1/logs
###############################################################
Original:  log4j.appender.EventCounter=org.apache.hadoop.hive.shims.HiveEventCounter
Change to: log4j.appender.EventCounter=org.apache.hadoop.log.metrics.EventCounter
############################################################### 

Otherwise you will see warnings such as:
WARN conf.HiveConf: HiveConf of name hive.metastore.local does not exist
WARNING: org.apache.hadoop.metrics.jvm.EventCounter is deprecated. Please use org.apache.hadoop.log.metrics.EventCounter in all the log4j.properties files.

8. Obtain hive-hwi-0.13.1.war

# extract the JSP files from the source tree and package them as a war in hive/lib
# first check whether one already exists: ll ${HIVE_HOME}/lib/*.war
tar xvf hive-0.13.1-cdh5.2.1-src.tar.gz
cd hive-0.13.1-cdh5.2.1-src/hwi
jar cfM hive-hwi-0.13.1.war -C web .
cp hive-hwi-0.13.1.war ${HIVE_HOME}/lib/

9. Manage library jars

# copy the hbase-client and hbase-common jars into the Hive lib dir (check with: ll ${HIVE_HOME}/lib/hbase*.jar)
cp ${HBASE_HOME}/lib/hbase-client-0.98.6.1.jar ${HIVE_HOME}/lib/
cp ${HBASE_HOME}/lib/hbase-common-0.98.6.1.jar ${HIVE_HOME}/lib/
# align the jline versions used by Hive and Hadoop
find ${HADOOP_HOME}/share/hadoop/yarn/lib/ -name "*jline*jar" 
find ${HIVE_HOME}/lib/ -name "*jline*jar" 
# If the versions differ, copy the newer jar into the other directory and delete the older one.
Example:
cp ${HIVE_HOME}/lib/jline-2.12.jar ${HADOOP_HOME}/share/hadoop/yarn/lib/
rm ${HADOOP_HOME}/share/hadoop/yarn/lib/jline-2.11.jar
# add the MySQL JDBC driver; the jasper jars fix HTTP 500 errors on the HWI page
cp mysql-connector-java-5.1.25.jar ${HIVE_HOME}/lib/
cp jasper-compiler.jar ${HIVE_HOME}/lib/
cp jasper-runtime.jar ${HIVE_HOME}/lib/
# fixes: Perhaps JAVA_HOME does not point to the JDK.
cp ${JAVA_HOME}/lib/tools.jar ${HIVE_HOME}/lib/
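Deciding which jline jar to keep comes down to a version comparison. A small sketch using sort -V (the two filenames are the ones from this walkthrough; substitute whatever your two find commands printed):

```shell
# Jar names found under Hadoop's yarn/lib and Hive's lib (illustrative).
hadoop_jline=jline-2.11.jar
hive_jline=jline-2.12.jar
# sort -V orders version strings numerically; the last line is the newest.
newest=$(printf '%s\n' "$hadoop_jline" "$hive_jline" | sort -V | tail -n1)
echo "keep: $newest (copy it over the other location, remove the older jar)"
```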

10. Initialize the metastore schema

[[email protected] bin]$ ./schematool -dbType mysql -initSchema

A successful initialization looks like this:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/ocetl/app/hbase-1.2.0-cdh5.9.3/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/ocetl/app/hadoop-2.6.0-cdh5.9.3/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Metastore connection URL:    jdbc:mysql://192.168.*.*:3306/hive?createDatabaseIfNotExist=true
Metastore Connection Driver :    com.mysql.jdbc.Driver
Metastore connection User:   hive1
Starting metastore schema initialization to 1.1.0
Initialization script hive-schema-1.1.0.mysql.sql
Initialization script completed
schemaTool completed

At this point the tables created during initialization are visible in the hive database in MySQL.



11. Start Hive

Option 1: hive -f /root/shell/hive-script.sql (for multi-statement scripts)
Option 2: hive -e 'SQL statement' (for short one-liners)
Option 3: hive (interactive shell)
To integrate Hive with HBase, do not start it with a bare ./hive; launch it as follows instead
(if the quorum is already set in hive-site.xml, plain hive is sufficient):
hive -hiveconf  hbase.zookeeper.quorum=zookeeper1,zookeeper2
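The three launch styles can be lined up side by side. Nothing in this sketch actually starts Hive; it only prepares the script file that option 1 would consume (the path is illustrative):

```shell
# Script file for option 1 (hive -f).
cat > /tmp/hive-script.sql <<'EOF'
show databases;
EOF
echo "option 1: hive -f /tmp/hive-script.sql    # run a script file"
echo "option 2: hive -e 'show databases;'       # run a one-liner"
echo "option 3: hive                            # interactive CLI"
```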

Verify that Hive started

Run show databases; and confirm that the default database is listed.

[[email protected] conf]$ hive
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/ocetl/app/hbase-1.2.0-cdh5.9.3/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/ocetl/app/hadoop-2.6.0-cdh5.9.3/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

Logging initialized using configuration in file:/home/ocetl/app/hive/conf/hive-log4j.properties
WARNING: Hive CLI is deprecated and migration to Beeline is recommended.
hive> show databases;
OK
default
Time taken: 0.236 seconds, Fetched: 1 row(s)

Start the HWI web page

[[email protected] bin]$ hive --service hwi > ${HIVE_HOME}/logs/hwi.log 2>&1 &
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/ocetl/app/hbase-1.2.0-cdh5.9.3/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/ocetl/app/hadoop-2.6.0-cdh5.9.3/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
18/07/17 17:30:44 INFO hwi.HWIServer: HWI is starting up
18/07/17 17:30:45 INFO mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
18/07/17 17:30:45 INFO mortbay.log: jetty-6.1.26.cloudera.4
18/07/17 17:30:45 INFO mortbay.log: Extract /home/ocetl/app/hive/lib/hive-hwi-0.13.1.war to /tmp/Jetty_0_0_0_0_9999_hive.hwi.0.13.1.war__hwi__.xvnhjk/webapp
18/07/17 17:30:45 INFO mortbay.log: Started [email protected]:9999

Now open a browser on the hive.hwi.listen.port port (9999 in the log above) to reach the HWI page.

