2025-02-24 Update From: SLTechnology News&Howtos
Shulou (Shulou.com) 06/02 Report--
This article walks through installing the Hive database. If you have questions about Hive installation, the simple, step-by-step procedure below should answer them; follow along to work through it.
= 1. Installation of the Hive database =
1. First, install the Hadoop environment described in the previous article.
2. Install MySQL to store Hive's metadata. The default metadata store, Derby, supports only a single connection and is suitable only for testing, so use MySQL in a real environment.
3. The installation environment is CentOS 6.5; the IP is 192.168.0.12.
= 2. Install the MySQL database to store Hive metadata =
yum install mysql-server
mysql -uroot -p
create database hive;
update mysql.user set password=PASSWORD('root') where User='root';
flush privileges;
= 3. Install Hive =
A Java environment is required; it was already set up during the Hadoop installation above.
cd /data/hadoop
wget -c http://114.242.101.2:808/hive/apache-hive-2.3.2-bin.tar.gz
tar xf apache-hive-2.3.2-bin.tar.gz
mv apache-hive-2.3.2-bin hive
chown -R hadoop:hadoop hive
Set the Hive environment variables (the Hadoop variables were set earlier):
vim /etc/profile
# hive
export HIVE_HOME=/data/hadoop/hive
export PATH=$HIVE_HOME/bin:$PATH
source /etc/profile
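The export lines above simply prepend Hive's bin directory to the shell's search path. A minimal Python sketch of that effect (the function name and sample PATH are hypothetical; the directory values mirror this article):

```python
# Mimic the effect of `export PATH=$HIVE_HOME/bin:$PATH` on a PATH string.
def with_hive_on_path(path, hive_home="/data/hadoop/hive"):
    """Prepend Hive's bin directory, as the export line in /etc/profile does."""
    return f"{hive_home}/bin:{path}"

print(with_hive_on_path("/usr/local/bin:/usr/bin"))
# /data/hadoop/hive/bin:/usr/local/bin:/usr/bin
```

Because the Hive directory comes first, the shell finds the `hive` launcher before anything else of the same name on the path.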
= 4. Modify the Hive configuration file =
su - hadoop
cd /data/hadoop/hive/conf
mv hive-default.xml.template hive-site.xml
Clear the contents between the <configuration> tags in the file and add the following:
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://127.0.0.1:3306/hive?characterEncoding=UTF-8</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>root</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>root</value>
</property>
Copy MySQL's JDBC driver jar into Hive's lib directory:
cd /data/hadoop/hive/lib/
wget -c http://114.242.101.2:808/hive/mysql-connector-java-5.1.44-bin.jar
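As a cross-check on the configuration above, the four metastore properties can be generated and validated programmatically. A minimal Python sketch (the helper name is hypothetical; the property names and values mirror the hive-site.xml shown in this article):

```python
import xml.etree.ElementTree as ET

# Metastore connection settings from the hive-site.xml above.
PROPS = {
    "javax.jdo.option.ConnectionURL":
        "jdbc:mysql://127.0.0.1:3306/hive?characterEncoding=UTF-8",
    "javax.jdo.option.ConnectionDriverName": "com.mysql.jdbc.Driver",
    "javax.jdo.option.ConnectionUserName": "root",
    "javax.jdo.option.ConnectionPassword": "root",
}

def build_hive_site(props):
    """Return a <configuration> element with one <property> per setting."""
    root = ET.Element("configuration")
    for name, value in props.items():
        prop = ET.SubElement(root, "property")
        ET.SubElement(prop, "name").text = name
        ET.SubElement(prop, "value").text = value
    return root

xml_text = ET.tostring(build_hive_site(PROPS), encoding="unicode")
print(xml_text)
```

Generating the file this way guarantees the XML is well-formed, which is easy to break when editing a 10,000-line hive-default.xml template by hand.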
= 5. Hive's default storage path on HDFS =
The official documentation says:
Hive uses Hadoop, so you must have Hadoop in your PATH, or set:
export HADOOP_HOME=<hadoop-install-dir>
In addition, you must use the HDFS commands below to create /tmp and /user/hive/warehouse (aka hive.metastore.warehouse.dir) and set them chmod g+w before you can create a table in Hive.
su - hadoop
cd /data/hadoop/hadoop-2.7.4
./bin/hadoop fs -mkdir /tmp
./bin/hadoop fs -mkdir -p /user/hive/warehouse
./bin/hadoop fs -chmod g+w /tmp
./bin/hadoop fs -chmod g+w /user/hive/warehouse
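The `chmod g+w` in the commands above just sets the group-write bit on the directory's permission mode, so that other members of the hadoop group (such as Hive) can write into it. A small Python sketch of that bit arithmetic (the function name is hypothetical):

```python
import stat

def add_group_write(mode):
    """Set the group-write bit on a permission mode, like `chmod g+w`."""
    return mode | stat.S_IWGRP  # S_IWGRP is the 0o020 bit

# A directory created as 0o755 (rwxr-xr-x) becomes 0o775 (rwxrwxr-x).
print(oct(add_group_write(0o755)))
# 0o775
```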
= 6. Run Hive =
Output like the following indicates that Hive started successfully:
[hadoop@localhost hadoop]$ hive
which: no hbase in (/data/hadoop/hadoop-2.7.4/bin:/data/hadoop/hive/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el6_9.x86_64/bin:/home/hadoop/bin)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/data/hadoop/hive/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/data/hadoop/hadoop-2.7.4/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Logging initialized using configuration in jar:file:/data/hadoop/hive/lib/hive-common-2.3.2.jar!/hive-log4j2.properties Async: true
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.x releases.
hive>
= 7. Initialize the Hive database =
su - hadoop
schematool -initSchema -dbType mysql
= 8. Run Hive operation commands =
hive> CREATE TABLE pokes (foo INT, bar STRING);
hive> CREATE TABLE invites (foo INT, bar STRING) PARTITIONED BY (ds STRING);
hive> SHOW TABLES;
hive> SHOW TABLES '.*s';
hive> DESCRIBE invites;
hive> ALTER TABLE events RENAME TO 3koobecaf;
hive> ALTER TABLE pokes ADD COLUMNS (new_col INT);
hive> ALTER TABLE invites ADD COLUMNS (new_col2 INT COMMENT 'a comment');
hive> ALTER TABLE invites REPLACE COLUMNS (foo INT, bar STRING, baz INT COMMENT 'baz replaces new_col2');
hive> ALTER TABLE invites REPLACE COLUMNS (foo INT COMMENT 'only keep the first column');
hive> DROP TABLE pokes;
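The `PARTITIONED BY (ds STRING)` clause above means each partition of `invites` gets its own HDFS directory under the warehouse path, named `<column>=<value>`. A minimal Python sketch of that layout (the helper name and the date value are hypothetical; the warehouse root matches the /user/hive/warehouse directory created earlier):

```python
# Hive stores a managed partitioned table as one directory per partition,
# laid out as <warehouse>/<table>/<partition_col>=<value>.
WAREHOUSE = "/user/hive/warehouse"  # hive.metastore.warehouse.dir

def partition_path(table, partitions):
    """Build the HDFS directory for one partition of a Hive-managed table."""
    parts = "/".join(f"{col}={val}" for col, val in partitions)
    return f"{WAREHOUSE}/{table}/{parts}"

print(partition_path("invites", [("ds", "2008-08-15")]))
# /user/hive/warehouse/invites/ds=2008-08-15
```

This directory-per-partition layout is why queries that filter on `ds` can skip reading unrelated partitions entirely.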
This concludes the walkthrough of installing the Hive database. Combining the theory here with hands-on practice is the best way to learn, so give it a try!