Shulou (Shulou.com) 06/03 Report, SLTechnology News & Howtos > Internet Technology. Updated 2025-04-03.
Environment requirements:

MySQL
Hadoop
Hive version: apache-hive-1.2.1-bin.tar
1. Set up the hive user

Go to the MySQL command line, create the hive user, and grant it all privileges:

mysql -uroot -proot
mysql> create user 'hive' identified by 'hive';
mysql> grant all on *.* to 'hive'@'%' with grant option;
mysql> flush privileges;
2. Create the hive database

Log in as the hive user and create the hive database:

mysql -uhive -phive
mysql> create database hive;
mysql> show databases;
3. Install Hive

Download the Hive installation file

You can go to the Apache Hive download page, http://hive.apache.org/downloads.html, and choose a mirror such as http://mirrors.cnnic.cn/apache/hive/ to download a stable release.

Put the MySQL driver in Hive's lib directory

After downloading the Hive installation package and the MySQL driver package, copy the driver into place:

cd /home/hadoop/Downloads
cp mysql-connector-java-5.1.34-bin.jar /usr/local/hive-0.12.0/lib
4. Configure the /etc/profile environment variables

Open the /etc/profile file with the following command:

sudo vi /etc/profile

# set hive path
export HIVE_HOME=/usr/local/hive
export PATH=$PATH:$HIVE_HOME/bin
export CLASSPATH=$CLASSPATH:$HIVE_HOME/bin

Make the configuration take effect:

source /etc/profile
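To see why the exports above matter, here is a minimal sketch of the PATH mechanics. It uses a throwaway directory /tmp/hive-demo with a stub launcher standing in for the real install prefix (/usr/local/hive), so only the environment-variable behavior is demonstrated:

```shell
# Stand-in for the real Hive install prefix; only PATH resolution is shown.
HIVE_HOME=/tmp/hive-demo
mkdir -p "$HIVE_HOME/bin"
printf '#!/bin/sh\necho "hive stub"\n' > "$HIVE_HOME/bin/hive"
chmod +x "$HIVE_HOME/bin/hive"

# Same shape as the /etc/profile entries above.
export HIVE_HOME
export PATH=$PATH:$HIVE_HOME/bin

# The shell can now resolve the launcher by name.
command -v hive   # prints the resolved launcher path
```

After `source /etc/profile`, the real `hive` command becomes available in any new shell the same way.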
5. Set up the hive-env.sh configuration file

Enter the hive-0.12.0/conf directory and copy hive-env.sh.template to hive-env.sh:

cd /usr/local/hive-0.12.0/conf
cp hive-env.sh.template hive-env.sh
ls
vi hive-env.sh

Set the values of HADOOP_HOME and HIVE_CONF_DIR respectively.
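As a sketch, the two entries in hive-env.sh might look like the following. The Hadoop path /usr/local/hadoop is an assumption; substitute your actual Hadoop installation directory:

```shell
# Illustrative hive-env.sh values; adjust both paths to your installation.
# /usr/local/hadoop is an assumed Hadoop location, not from the original text.
HADOOP_HOME=/usr/local/hadoop
export HIVE_CONF_DIR=/usr/local/hive-0.12.0/conf
```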
6. Set up the hive-site.xml configuration file

Copy hive-default.xml.template to hive-site.xml:

cp hive-default.xml.template hive-site.xml
sudo vi hive-site.xml
(1) Add a configuration item

The metastore defaults to local; if you keep a local metastore, this step can be skipped. For a remote metastore, set:

<property>
  <name>hive.metastore.local</name>
  <value>false</value>
</property>
(2) Modify configuration items

Hive defaults to the Derby database; the relevant items need to be pointed at the MySQL database:

<property>
  <name>hive.metastore.uris</name>
  <value>thrift://hadoop1:9083</value>
  <description>Thrift URI for the remote metastore.</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://hadoop1:3306/hive?createDatabaseIfNotExist=true</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
  <description>Driver class name for a JDBC metastore</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
  <description>username to use against metastore database</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hive</value>
  <description>password to use against metastore database</description>
</property>
(3) Correction of errors

Around line 2000 of the configuration file, the hive.server2.thrift.sasl.qop item (value auth) needs to be corrected so that the entry reads:

<property>
  <name>hive.server2.thrift.sasl.qop</name>
  <value>auth</value>
</property>

Then verify that Hive works:

hive> create table test (a string, b int);
hive> show tables;
hive> desc test;
If Hive is unable to start normally, see:

http://blog.csdn.net/olfisher/article/details/48730229

The error "Terminal initialization failed; falling back to unsupported" is caused by an old jline jar in the Hadoop directory, hadoop-2.5.2/share/hadoop/yarn/lib/jline-0.9.94.jar (87325 bytes). The solution is to copy the newer jline jar shipped under Hive into Hadoop:

cp /hive/apache-hive-1.1.0-bin/lib/jline-2.12.jar /hadoop-2.5.2/share/hadoop/yarn/lib/

After backing up the old jar, the directory should contain:

-rw-r--r-- 1 root root  87325 Mar 10 18:10 jline-0.9.94.jar.bak
-rw-r--r-- 1 root root 213854 Mar 11 22:22 jline-2.12.jar
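The jline swap above can be sketched end to end in a throwaway sandbox. The /tmp directories stand in for the real paths ($HIVE_HOME/lib and $HADOOP_HOME/share/hadoop/yarn/lib), and empty files stand in for the jars; only the backup-then-copy sequence is demonstrated:

```shell
# Sandbox stand-ins for the real Hive and Hadoop lib directories.
yarn_lib=/tmp/yarn-lib-demo
hive_lib=/tmp/hive-lib-demo
mkdir -p "$yarn_lib" "$hive_lib"
touch "$yarn_lib/jline-0.9.94.jar" "$hive_lib/jline-2.12.jar"

# Park the old jar so Hadoop stops loading it...
mv "$yarn_lib/jline-0.9.94.jar" "$yarn_lib/jline-0.9.94.jar.bak"
# ...and bring in the newer jline that ships with Hive.
cp "$hive_lib/jline-2.12.jar" "$yarn_lib/"

ls "$yarn_lib"
```

The key point is that the old jar must no longer be on the YARN classpath; renaming it to .bak while copying in jline-2.12.jar achieves that.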