Hive stand-alone mode installation - jared
This deployment note was recorded in early 2014 and is also posted on 51cto.
For details on building the basic hadoop environment, please refer to the following link:
http://ganlanqing.blog.51cto.com/6967482/1387210
JDK version: jdk-7u51-linux-x64.rpm
Hadoop version: hadoop-0.20.2.tar.gz
Hive version: hive-0.12.0.tar.gz
MySQL driver package version: mysql-connector-java-5.1.7-bin.jar
1. Install the MySQL environment
[root@master ~]# yum install mysql mysql-server -y
[root@master ~]# /etc/init.d/mysqld start
[root@master ~]# mysqladmin -uroot password "123456"
[root@master ~]# mysql -uroot -p123456
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 2
Server version: 5.1.73 Source distribution
Copyright (c) 2000, 2013, Oracle and/or its affiliates. All rights reserved.
Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.
Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.
mysql> create user 'hive' identified by '123456';
Query OK, 0 rows affected (0.00 sec)
mysql> GRANT ALL PRIVILEGES ON *.* TO 'hive'@'master' IDENTIFIED BY '123456' WITH GRANT OPTION;
Query OK, 0 rows affected (0.00 sec)
mysql> flush privileges;
Query OK, 0 rows affected (0.00 sec)
mysql> exit
Bye
[root@master ~]#
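Not part of the original note, but on a CentOS-style setup like this one you would typically also enable mysqld at boot; a minimal sketch:
[root@master ~]# chkconfig mysqld on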
#
Note: the step where the hive metastore database is created for the hive user is missing here; a possible sketch follows below.
#
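A minimal sketch of what that missing step might look like (my addition, not from the original note; it assumes the metastore database is simply named hive, matching the JDBC URL used later):
mysql> CREATE DATABASE hive;
mysql> GRANT ALL PRIVILEGES ON hive.* TO 'hive'@'master' IDENTIFIED BY '123456';
mysql> flush privileges;
In practice the hive-site.xml below also sets createDatabaseIfNotExist=true, so the database can also be created automatically the first time Hive connects.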
2. Download the hive installation package
[jared@master conf]$ wget http://mirror.bjtu.edu.cn/apache/hive/hive-0.12.0/hive-0.12.0-bin.tar.gz
[jared@master conf]$ gzip -d hive-0.12.0.tar.gz
[jared@master conf]$ tar -xf hive-0.12.0.tar
[jared@master conf]$ mv hive-0.12.0 hive
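Note that the archive names above are inconsistent: the download is hive-0.12.0-bin.tar.gz, while the later commands operate on hive-0.12.0.tar.gz. Assuming the tarball keeps its original name and unpacks to hive-0.12.0-bin, the equivalent steps would be:
[jared@master conf]$ tar -zxf hive-0.12.0-bin.tar.gz
[jared@master conf]$ mv hive-0.12.0-bin hive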
3. Set environment variables
[root@master ~]# vim /etc/profile
export JAVA_HOME=/usr/java/jdk1.7.0_51
export HIVE_HOME=/home/jared/hive
export HIVE_CONF_DIR=/home/jared/hive/conf
export HIVE_LIB=$HIVE_HOME/lib
export HADOOP_INSTALL=/home/jared/hadoop
export HBASE_INSTALL=/home/jared/hbase
export PATH=$PATH:$HADOOP_INSTALL/bin:$HBASE_INSTALL/bin:$HIVE_HOME/bin
[root@master ~]# source /etc/profile
[root@master ~]# exit
logout
[jared@master conf]$ pwd
/home/jared/hive/conf
[jared@master conf]$ source /etc/profile
[jared@master conf]$ echo $HIVE_HOME
/home/jared/hive
[jared@master conf]$ cp hive-env.sh.template hive-env.sh
[jared@master conf]$ vim hive-env.sh
export HADOOP_HEAPSIZE=1024
HADOOP_HOME=/home/jared/hadoop
export HIVE_CONF_DIR=/home/jared/hive/conf
export HIVE_AUX_JARS_PATH=/home/jared/hive/lib
[jared@master conf]$ source hive-env.sh
4. Configure hive-site.xml
[jared@master conf]$ vim hive-site.xml
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://master:3306/hive?createDatabaseIfNotExist=true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>123456</value>
</property>
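As a quick sanity check (my own addition, not in the original note), the metastore tables that Hive creates on first use should be visible in MySQL:
[root@master ~]# mysql -uroot -p123456 -e "show tables in hive;"
If the hive database is still empty at this point, Hive has not yet connected to the metastore with this configuration.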
5. Copy the MySQL driver package to the lib directory under the Hive installation path
[jared@master ~]$ wget http://cdn.mysql.com/archives/mysql-connector-java-5.1/mysql-connector-java-5.1.7.tar.gz
[jared@master ~]$ tar -zxvf mysql-connector-java-5.1.7.tar.gz
[jared@master ~]$ cd mysql-connector-java-5.1.7
[jared@master ~]$ cp mysql-connector-java-5.1.7-bin.jar /home/jared/hive/lib/
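A quick check (my addition) that the driver actually landed in Hive's lib directory; the listing should include mysql-connector-java-5.1.7-bin.jar:
[jared@master ~]$ ls /home/jared/hive/lib/ | grep mysql-connector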
6. CLI access interface: shell
[jared@master ~]$ hive
Logging initialized using configuration in jar:file:/home/jared/hive/lib/hive-common-0.12.0.jar!/hive-log4j.properties
hive> show databases;
OK
default
Time taken: 11.506 seconds, Fetched: 1 row(s)
hive> create table test (key string);
OK
Time taken: 2.805 seconds
hive> show tables;
OK
test
Time taken: 0.091 seconds, Fetched: 1 row(s)
hive>
7. Local upload data test
Local file information
File name: access.log
Size: 11M
[jared@master input]$ du -h access.log
11M     access.log
[jared@master input]$ cat access.log | wc -l
60000
Data structure:
[jared@master input]$ cat access.log
1393960136.926 0 212.92.231.166 TCP_DENIED/403 1256 GET http://221.181.39.85/phpTest/zologize/axa.php - NONE/- text/html "-"--
1393960137.600 0 212.92.231.166 TCP_DENIED/403 1264 GET http://221.181.39.85/phpMyAdmin/scripts/setup.php - NONE/- text/html "-"--
1393960138.274 0 212.92.231.166 TCP_DENIED/403 1250 GET http://221.181.39.85/pma/scripts/setup.php - NONE/- text/html "-"--
1393960138.946 0 212.92.231.166 TCP_DENIED/403 1258 GET http://221.181.39.85/myadmin/scripts/setup.php - NONE/- text/html "-"--
1393960143.624 1 127.0.0.1 TCP_HIT/200 22874 GET http://www.chinacache.com/images/logo.gif - NONE/- image/gif "-"--
1393960143.628 1 127.0.0.1 TCP_HIT/200 22874 GET http://www.chinacache.com/images/logo.gif - NONE/- image/gif "-"--
1393960144.636 2 127.0.0.1 TCP_HIT/200 22874 GET http://www.chinacache.com/images/logo.gif - NONE/- image/gif "-"--
1393960145.643 2 127.0.0.1 TCP_HIT/200 22874 GET http://www.chinacache.com/images/logo.gif - NONE/- image/gif "-"--
1393982948.194 1 112.5.4.63 TCP_HIT/200 467 GET http://cu005.www.duba.net/duba/2011/kcomponent/kcom_commonfast/53a08fed.dat - NONE/- text/plain "-"--
1393982948.246 0 218.203.54.25 TCP_HIT/200 462 GET http://cu005.www.duba.net/duba/2011/kcomponent/kcom_kvm2/indexkcom_kvm2.dat - NONE/- text/plain "-"--
1393982948.258 0 218.203.54.25 TCP_HIT/200 467 GET http://cu005.www.duba.net/duba/2011/kcomponent/kcom_commonfast/53a08fed.dat - NONE/- text/plain "-
Establish the table structure
hive> CREATE TABLE CU005_LOG (TIMES_TAMP STRING, RES_TIME INT, FC_IP STRING, FC_HANDLING STRING, FILE_SIZE INT, REQ_METHOD STRING, URL STRING, USER STRING, BACK_SRC STRING, MIME STRING, REFERER STRING, UA STRING, COOKIE STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY ' ' STORED AS TEXTFILE;
hive> show tables;
OK
cu005_log
Time taken: 0.08 seconds, Fetched: 1 row(s)
hive> desc cu005_log;
OK
times_tamp string None
res_time int None
fc_ip string None
fc_handling string None
file_size int None
req_method string None
url string None
user string None
back_src string None
mime string None
referer string None
ua string None
cookie string None
Time taken: 0.208 seconds, Fetched: 13 row(s)
hive>
Import local data
hive> LOAD DATA LOCAL INPATH '/home/jared/input/access.log' OVERWRITE INTO TABLE CU005_LOG;
Copying data from file:/home/jared/input/access.log
Copying file: file:/home/jared/input/access.log
Loading data to table default.cu005_log
Table default.cu005_log stats: [num_partitions: 0, num_files: 1, num_rows: 0, total_size: 10872324, raw_data_size: 0]
OK
Time taken: 1.811 seconds
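As a small sanity check (my own addition, not in the original transcript), a query with a LIMIT can confirm that the space delimiter splits the fields as intended:
hive> select times_tamp, fc_ip, fc_handling, url from cu005_log limit 3;
If the url column comes back empty or shifted, the field delimiter in the CREATE TABLE statement needs revisiting.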
Storage location in the hadoop cluster
hdfs://master:9000/user/hive/warehouse/cu005_log/access.log
[jared@master ~]$ hadoop dfs -ls /user/hive/warehouse/
Found 1 items
drwxr-xr-x   - jared supergroup          0 2014-03-06 18:31 /user/hive/warehouse/cu005_log
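To see the loaded file itself (my addition, not part of the original listing), list one level deeper:
[jared@master ~]$ hadoop dfs -ls /user/hive/warehouse/cu005_log/
This should show access.log with a size of 10872324 bytes, matching the stats reported by the LOAD command.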
Query
hive> select count(*) from cu005_log;
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapred.reduce.tasks=<number>
Starting Job = job_201402230829_0003, Tracking URL = http://master:50030/jobdetails.jsp?jobid=job_201402230829_0003
Kill Command = /home/jared/hadoop/bin/../bin/hadoop job -kill job_201402230829_0003
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 1
2014-03-06 17:18:15,994 Stage-1 map = 0%, reduce = 0%
2014-03-06 17:18:32,121 Stage-1 map = 100%, reduce = 0%
2014-03-06 17:18:44,200 Stage-1 map = 100%, reduce = 33%
2014-03-06 17:18:47,220 Stage-1 map = 100%, reduce = 100%
Ended Job = job_201402230829_0003
MapReduce Jobs Launched:
Job 0: Map: 1  Reduce: 1  HDFS Read: 10872324 HDFS Write: 6 SUCCESS
Total MapReduce CPU Time Spent: 0 msec
OK
60000
Time taken: 77.157 seconds, Fetched: 1 row(s)
hive>
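A slightly richer example query (my own addition, not from the original note) that breaks the log down by cache result; like the count above, it runs as a single MapReduce job:
hive> select fc_handling, count(*) from cu005_log group by fc_handling;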
Web interface access
Please refer to https://cwiki.apache.org/confluence/display/Hive/HiveWebInterface for details.
You need to add some configuration items to the $HIVE_HOME/conf/hive-site.xml configuration file, as follows:
<property>
  <name>hive.hwi.listen.host</name>
  <value>192.168.255.25</value>
  <description>This is the host address the Hive Web Interface will listen on</description>
</property>
<property>
  <name>hive.hwi.listen.port</name>
  <value>9999</value>
  <description>This is the port the Hive Web Interface will listen on</description>
</property>
<property>
  <name>hive.hwi.war.file</name>
  <value>lib/hive-hwi-0.12.0.war</value>
  <description>This is the WAR file with the jsp content for Hive Web Interface</description>
</property>
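Before starting HWI, it is worth checking (my addition) that the WAR file referenced above actually exists under $HIVE_HOME/lib; the listing should include hive-hwi-0.12.0.war:
[jared@master ~]$ ls $HIVE_HOME/lib/ | grep hwi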
Start hive hwi
Background start
[jared@master ~]$ nohup hive --service hwi > /dev/null 2>/dev/null &
Access http://192.168.255.25:9999/hwi/ in a browser.
For usage, refer to http://www.cnblogs.com/gpcuster/archive/2010/02/25/1673480.html
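Because HWI is started in the background with its output discarded, a quick way (my addition) to confirm it is actually listening on the configured port:
[jared@master ~]$ netstat -nlt | grep 9999
If nothing is listening, rerun hive --service hwi in the foreground to see the error.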
Comparison between HWI and CLI
Anyone who has used the CLI will notice a serious problem in the description above: HWI gives no feedback during execution, so you cannot tell when a query has finished.
To summarize the pros and cons of HWI versus the CLI:
Advantages: HWI can be used from a browser, which is convenient and intuitive.
Disadvantages: no feedback on execution progress.
Personally, I prefer the CLI.