This article mainly introduces how to use gphdfs external tables together with Hive. It has some reference value, and interested readers can follow along; I hope you find it useful.
1. Install Java 1.6 or above on all nodes.
2. Install Hadoop on the segment hosts; the following versions are supported (a quick verification sketch follows the version table).
Hadoop Distribution          Version                        gp_hadoop_target_version
Pivotal HD                   Pivotal HD 3.0, 3.0.1          gphd-3.0
                             Pivotal HD 2.0, 2.1            gphd-2.0
                             Pivotal HD 1.0                 gphd-2.0
Greenplum HD                 Greenplum HD 1.2               gphd-1.2
                             Greenplum HD 1.1               gphd-1.1 (default)
Cloudera                     CDH 5.2, 5.3, 5.4.x, 5.5.x     cdh5
                             CDH 5.0, 5.1                   cdh4.1
                             CDH 4.1 - CDH 4.7              cdh4.1
Hortonworks Data Platform    HDP 2.1, 2.2, 2.3              hdp2
MapR                         MapR 4.x                       gpmr-1.2
                             MapR 1.x, 2.x, 3.x             gpmr-1.0
Apache Hadoop                2.x                            hadoop2
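Before continuing, it can help to confirm that every segment host actually sees the same Java and Hadoop installation. A minimal check, assuming the standard Greenplum gpssh utility is available and a host file such as ~/hostfile_seg lists all segment hosts (the host file name is just a placeholder for illustration):
# hostfile_seg is a placeholder: one segment hostname per line
[gpadmin@db10 ~]$ gpssh -f ~/hostfile_seg -e 'java -version'
[gpadmin@db10 ~]$ gpssh -f ~/hostfile_seg -e 'hadoop version'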
3. Configure the gp environment on all nodes
[gpadmin@db10 ~]$ echo $JAVA_HOME
/usr/java/default
[gpadmin@db10 ~]$ echo $HADOOP_HOME
/opt/hadoop
[gpadmin@db10 ~]$ hadoop version
Hadoop 2.7.2
[gpadmin@db10 ~]$ vim .bashrc
if [ -f /etc/bashrc ]; then
    . /etc/bashrc
fi
# User specific aliases and functions
source /opt/greenplum-db/greenplum_path.sh
export MASTER_DATA_DIRECTORY=/hdd1/master/gpseg-1
export JAVA_HOME=/usr/java/default
export HADOOP_HOME=/opt/hadoop
export HADOOP_VERSION=2.7.2
export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin
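The same .bashrc has to be in place for gpadmin on every host. One way to push it out and check it, sketched here assuming gpscp/gpssh are available and ~/hostfile_seg (a placeholder name) lists the segment hosts:
# copy the file to each host, then re-read it and print the two variables
[gpadmin@db10 ~]$ gpscp -f ~/hostfile_seg ~/.bashrc =:/home/gpadmin/
[gpadmin@db10 ~]$ gpssh -f ~/hostfile_seg -e 'source ~/.bashrc; echo $JAVA_HOME $HADOOP_HOME'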
4. Configure gp parameters
[gpadmin@db9 ~]$ gpconfig -c gp_hadoop_home -v "'/opt/hadoop'"
[gpadmin@db9 ~]$ gpconfig -c gp_hadoop_target_version -v "'hadoop2'"
[gpadmin@db9 ~]$ gpstop -u
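gpstop -u only reloads the configuration, so it is worth confirming that the parameters took effect before creating any external tables. A quick check (the database name edw matches the session shown later; adjust it to your own environment):
[gpadmin@db9 ~]$ gpconfig -s gp_hadoop_home
[gpadmin@db9 ~]$ gpconfig -s gp_hadoop_target_version
[gpadmin@db9 ~]$ psql -d edw -c 'show gp_hadoop_target_version;'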
5. Create tables on hive
Create table wdbd_dm.date_test1 (id int, name string);
Insert into wdbd_dm.date_test1 values (1,'wjian');
Insert into wdbd_dm.date_test1 values (2,'wuj');
Insert into wdbd_dm.date_test1 values (3,'mike');
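If you prefer to run these statements non-interactively, roughly the same thing can be done with the hive CLI; note that the multi-row VALUES form needs Hive 0.14 or later, and the exact invocation depends on how your Hive client is set up:
hive -e "create table wdbd_dm.date_test1 (id int, name string);"
hive -e "insert into wdbd_dm.date_test1 values (1,'wjian'),(2,'wuj'),(3,'mike');"
hive -e "select * from wdbd_dm.date_test1;"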
6. View the HDFS files generated by Hive from the gp host; three files were generated:
[gpadmin@db10 ~]$ hdfs dfs -ls hdfs://172.20.5.8:8020/user/hive/warehouse/wdbd_dm.db/date_test1/
……
-rwxrwxr-x 3 root root 8 2017-05-25 16:02 hdfs://172.20.5.8:8020/user/hive/warehouse/wdbd_dm.db/date_test1/part-00000
-rwxrwxr-x 3 root root 6 2017-05-25 16:02 hdfs://172.20.5.8:8020/user/hive/warehouse/wdbd_dm.db/date_test1/part-00000_copy_1
-rwxrwxr-x 3 root root 7 2017-05-25 16:02 hdfs://172.20.5.8:8020/user/hive/warehouse/wdbd_dm.db/date_test1/part-00000_copy_2
Note: 172.20.5.8 is the namenode address of the target cluster, and 8020 is the port
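To see the \001 delimiter inside one of those files, you can cat it and make control characters visible; with cat -v the delimiter shows up as ^A between the id and the name:
[gpadmin@db10 ~]$ hdfs dfs -cat hdfs://172.20.5.8:8020/user/hive/warehouse/wdbd_dm.db/date_test1/part-00000 | cat -v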
7. Create the gphdfs external table over these files
Create EXTERNAL table hdfs_test (id int, name varchar(32))
Location ('gphdfs://172.20.5.8:8020/user/hive/warehouse/wdbd_dm.db/date_test1/part*')
Format 'TEXT' (DELIMITER '\001');
Note: the default field delimiter for Hive text files is the ASCII control character \001.
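Whether the literal '\001' above is interpreted as that control character can depend on the server's standard_conforming_strings setting. If the external table comes back with unsplit lines, a variant worth trying is the escape-string form of the same definition (drop the first table before re-creating it; everything else is unchanged):
Create EXTERNAL table hdfs_test (id int, name varchar(32))
Location ('gphdfs://172.20.5.8:8020/user/hive/warehouse/wdbd_dm.db/date_test1/part*')
Format 'TEXT' (DELIMITER E'\001');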
edw=# select * from hdfs_test;
 id | name
----+-------
  2 | wuj
  3 | mike
  1 | wjian
(3 rows)
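Once the external table is readable, a typical next step is to materialize the data into a regular Greenplum table; local_test is just an illustrative name:
edw=# create table local_test as select * from hdfs_test distributed by (id);
edw=# select count(*) from local_test;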
Thank you for reading this article carefully. I hope this article on how to use gphdfs external tables with Hive has been helpful to you.