Detailed installation steps for Hive

2025-03-31 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/01 Report--

This article walks through Hive's detailed installation steps. Many people have questions about installing Hive, so the editor has consulted a range of material and put together a simple, easy-to-follow procedure. Hopefully it answers your questions - let's work through it together!

1. Unzip the file [root@hadoop0 opt]# tar -zxvf hive-0.9.0.tar.gz

2. Change name [root@hadoop0 opt]# mv hive-0.9.0 hive

3. Configure environment variables: edit the global variable file /etc/profile and add /opt/hive/bin to PATH

JAVA_HOME=/opt/jdk1.6.0_24

HADOOP_HOME=/opt/hadoop

HBASE_HOME=/opt/hbase

HIVE_HOME=/opt/hive

PATH=$JAVA_HOME/bin:$HADOOP_HOME/bin:$HBASE_HOME/bin:$HIVE_HOME/bin:$PATH

export JAVA_HOME HADOOP_HOME HBASE_HOME HIVE_HOME PATH

[root@hadoop0 bin]# su -
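After re-logging in with su - so the new profile takes effect, the step-3 configuration can be sanity-checked. A minimal sketch, assuming the exact paths configured above:

```shell
# Re-export the variables from /etc/profile (same values as step 3)
export JAVA_HOME=/opt/jdk1.6.0_24
export HADOOP_HOME=/opt/hadoop
export HBASE_HOME=/opt/hbase
export HIVE_HOME=/opt/hive
export PATH=$JAVA_HOME/bin:$HADOOP_HOME/bin:$HBASE_HOME/bin:$HIVE_HOME/bin:$PATH

# Verify that the Hive binary directory actually landed on PATH
case ":$PATH:" in
  *":$HIVE_HOME/bin:"*) echo "HIVE_HOME/bin is on PATH" ;;
  *)                    echo "HIVE_HOME/bin is missing from PATH" ;;
esac
```

If the check prints the "missing" message, the profile was not sourced in the current shell.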

4. Test run to see whether the installation succeeded [root@hadoop0 ~]# hive

WARNING: org.apache.hadoop.metrics.jvm.EventCounter is deprecated. Please use org.apache.hadoop.log.metrics.EventCounter in all the log4j.properties files.

Logging initialized using configuration in jar:file:/opt/hive/lib/hive-common-0.9.0.jar!/hive-log4j.properties

Hive history file=/tmp/root/hive_job_log_root_201509250619_148272494.txt

hive> show tables;

FAILED: Error in metadata: MetaException(message:Got exception: java.net.ConnectException Call to hadoop0/192.168.46.129:9000 failed on connection exception: java.net.ConnectException: Connection refused)

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask

--Solution: Hive relies on HDFS to store its data, so make sure Hadoop is started first

[root@hadoop0 ~]# start-all.sh

Warning: $HADOOP_HOME is deprecated.

starting namenode, logging to /opt/hadoop/libexec/../logs/hadoop-root-namenode-hadoop0.out

localhost: starting datanode, logging to /opt/hadoop/libexec/../logs/hadoop-root-datanode-hadoop0.out

localhost: starting secondarynamenode, logging to /opt/hadoop/libexec/../logs/hadoop-root-secondarynamenode-hadoop0.out

starting jobtracker, logging to /opt/hadoop/libexec/../logs/hadoop-root-jobtracker-hadoop0.out

localhost: starting tasktracker, logging to /opt/hadoop/libexec/../logs/hadoop-root-tasktracker-hadoop0.out

--At this point the simplest Hive environment configuration is complete
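Before retrying the Hive query, it can help to confirm that the NameNode RPC port from the earlier error (hadoop0:9000) is actually accepting connections. A minimal sketch using bash's /dev/tcp, shown here against localhost purely as an illustration:

```shell
# Probe the NameNode RPC port; host/port match the error message above.
host=localhost
port=9000
if (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null; then
  echo "port $port open - HDFS reachable"
else
  echo "port $port closed - start Hadoop (start-all.sh) first"
fi
```

Either branch prints a diagnostic, so the check is safe to run whether or not the cluster is up.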

5. Start creating data tables hive> show tables;

OK

Time taken: 5.619 seconds

hive> create table stu(name String,age int);

FAILED: Error in metadata: MetaException(message:Got exception: org.apache.hadoop.ipc.RemoteException

org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create directory /user/hive/warehouse/stu.

Name node is in safe mode.

The reported blocks 18 has reached the threshold 0.9990 of total blocks 17. Safe mode will be turned off automatically in 15 seconds.

at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:2204)

at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:2178)

at org.apache.hadoop.hdfs.server.namenode.NameNode.mkdirs(NameNode.java:857)

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)

at java.lang.reflect.Method.invoke(Method.java:597)

at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:578)

at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1393)

at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1389)

at java.security.AccessController.doPrivileged(Native Method)

at javax.security.auth.Subject.doAs(Subject.java:396)

at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)

at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1387)

)

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask

--Solution: the message shows the NameNode is still in safe mode right after startup; wait the reported 15 seconds for it to exit automatically (or force it out with hadoop dfsadmin -safemode leave), then retry

[root@hadoop0 ~]# mkdir -p /user/hive/warehouse/stu (note: this creates a local directory, not one on HDFS; the retry below succeeds because safe mode has lifted)

hive> create table stu(name String,age int);

OK

Time taken: 0.229 seconds

This version of Hive does not support INSERT ... VALUES statements hive> insert into stu values ('Meng ', 24);

FAILED: Parse Error: line 1:12 mismatched input 'stu' expecting TABLE near 'into' in insert clause

hive> show tables;

OK

stu

Time taken: 0.078 seconds

hive> desc stu;

OK

name string

age int

Time taken: 0.255 seconds

--Solution: Hive does not support this operation; use LOAD DATA to load the data instead

hive> LOAD DATA LOCAL INPATH '/opt/stu.txt' OVERWRITE INTO TABLE stu;

Copying data from file:/opt/stu.txt

Copying file: file:/opt/stu.txt

Loading data to table default.stu

Deleted hdfs://hadoop0:9000/user/hive/warehouse/stu

OK

Time taken: 0.643 seconds

hive> select name, age from stu;

Total MapReduce jobs = 1

Launching Job 1 out of 1

Number of reduce tasks is set to 0 since there's no reduce operator

Starting Job = job_201509250620_0001, Tracking URL = http://hadoop0:50030/jobdetails.jsp?jobid=job_201509250620_0001

Kill Command = /opt/hadoop/libexec/../bin/hadoop job -Dmapred.job.tracker=hadoop0:9001 -kill job_201509250620_0001

Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 0

2015-09-25 06:37:55,535 Stage-1 map = 0%, reduce = 0%

2015-09-25 06:37:58,565 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 0.59 sec

2015-09-25 06:37:59,595 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 0.59 sec

2015-09-25 06:38:00,647 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 0.59 sec

MapReduce Total cumulative CPU time: 590 msec

Ended Job = job_201509250620_0001

MapReduce Jobs Launched:

Job 0: Map: 1 Cumulative CPU: 0.59 sec HDFS Read: 221 HDFS Write: 22 SUCCESS

Total MapReduce CPU Time Spent: 590 msec

OK

--The query result is displayed

JieJie 26 NULL

MM 24 NULL

Time taken: 12.812 seconds

Question: why do NULL values appear? Most likely the field delimiter in /opt/stu.txt (a space or tab) does not match the table's default field delimiter \001 (Ctrl-A), so Hive reads each whole line into the name column and age parses to NULL. Creating the table with ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' (matching the file) avoids this.
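To make the delimiter issue concrete: Hive's default field separator is the non-printing control character \001 (Ctrl-A). A sketch of producing a data file in that format, using hypothetical sample rows:

```shell
# Write two rows in Hive's default format: name \001 age, one row per line.
printf 'JieJie\00126\nMM\00124\n' > /tmp/stu_demo.txt

# cat -A makes the invisible ^A delimiter visible for inspection
cat -A /tmp/stu_demo.txt
```

A file like this loads cleanly into a table created without an explicit ROW FORMAT clause, whereas a tab- or space-delimited file produces the NULLs seen above.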

This concludes the study of "Hive's detailed installation steps"; hopefully it has resolved your doubts. Theory sticks best when paired with practice, so go and try it yourself! To keep learning, please continue to follow the site for more practical articles.
