How to migrate hive data

This article mainly introduces how to migrate Hive data. Many people run into this task in daily operation, so the editor has consulted various materials and put together a simple, easy-to-follow procedure. Hopefully it helps clear up your doubts about migrating Hive data. Now, please follow along with the editor and study!

Hive data migration: data in the Hive warehouse of a cdh4u5 cluster needs to be migrated to the Hive warehouse of a cdh6.1 cluster. Because distcp cannot be used between these two clusters, the data has to be exported and imported manually.
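For context only, and not part of the original procedure: when two clusters can reach each other over HDFS, distcp is the usual way to copy warehouse directories directly. A minimal sketch, with hypothetical namenode host names and ports:

# hypothetical hosts/ports; the webhdfs:// scheme lets distcp read across Hadoop versions
hadoop distcp webhdfs://old-namenode:50070/data/warehouse/userdb.db/people_payment \
    hdfs://new-namenode:8020/user/hive/warehouse/userdb.db/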

On hadoop4:

cd /tmp/test/people_payment_log
hadoop fs -get /data/warehouse/userdb.db/people_payment/hour=201309* .
hadoop fs -get /data/warehouse/userdb.db/people_payment/hour=201310* .
hadoop fs -get /data/warehouse/userdb.db/people_payment/hour=201311* .
hadoop fs -get /data/warehouse/userdb.db/people_payment/hour=201312* .
hadoop fs -get /data/warehouse/userdb.db/people_payment/hour=201401* .
hadoop fs -get /data/warehouse/userdb.db/people_payment/hour=201402* .
hadoop fs -get /data/warehouse/userdb.db/people_payment/hour=201403* .
cd /tmp/test
tar -czf people_payment_log.tgz people_payment_log

Compress the data, copy it to hdp7 under /home/abc/cdh/people_payment, and decompress it there.

On hdp7:

scp -Cr hadoop4:/tmp/test/people_payment_log.tgz /home/abc/cdh/people_payment
cd /home/abc/cdh/people_payment; tar -xzf people_payment_log.tgz
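As a variation that is not in the original article, the compress, copy and decompress steps can also be streamed over ssh in one pipeline, avoiding the intermediate .tgz file; the paths are the same ones used above:

# run on hadoop4: stream the tarball to hdp7 and unpack it there in one step
cd /tmp/test
tar -czf - people_payment_log | ssh hdp7 'cd /home/abc/cdh/people_payment && tar -xzf -'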

Upload the data to the people_payment table of the cdh6 cluster; the shell script is as follows:

base_dir=/home/abc/cdh/people_payment
data_dir=$base_dir/people_payment_log

ls $data_dir > $base_dir/hour.txt
cd $data_dir

cat $base_dir/hour.txt | while read oneHour
do
    echo $oneHour
    hadoop fs -put $oneHour /user/hive/warehouse/userdb.db/people_payment/
done
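A quick sanity check, which is the editor's addition rather than part of the original script, is to list the target directory and confirm that the hour=... partition directories have arrived:

# list a few of the uploaded partition directories on the cdh6 cluster
hadoop fs -ls /user/hive/warehouse/userdb.db/people_payment/ | head -n 20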

Then the Hive metastore needs to be told that these partitions exist, so generate the ALTER TABLE ... ADD PARTITION statements:

base_dir=/home/abc/cdh/people_payment
cd $base_dir

echo "use userdb;" > $base_dir/alert.txt
cat $base_dir/hour.txt | while read oneHour
do
    realy_hour=`echo $oneHour | awk -F'=' '{print $2}'`
    echo "ALTER TABLE people_payment ADD PARTITION (hour='$realy_hour');" >> $base_dir/alert.txt
done

The content of alert.txt looks like this:

use userdb;
ALTER TABLE people_payment ADD PARTITION (hour='2013090100');
ALTER TABLE people_payment ADD PARTITION (hour='2013090101');

Then call hive -f alert.txt to add all the partitions in one batch.
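An alternative worth noting, although the original article does not use it: because the directories follow the standard hour=... partition layout, Hive's MSCK REPAIR TABLE can register the missing partitions automatically instead of generating ALTER statements (behaviour may vary with the Hive version):

# let Hive scan the table location and add partitions that exist on HDFS
# but are missing from the metastore
hive -e "use userdb; MSCK REPAIR TABLE people_payment;"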

If the data already exists as local files, you can load it into Hive in the following way.

The script reads as follows:

base_dir=/home/abc/cdh/people_payment
data_dir=/data/login/data_login_raw
hive_db=userdb
table=user_login

ls $data_dir/a.bc.d.201408* | awk -F'.' '{print $5}' > $base_dir/hour.txt

cat $base_dir/hour.txt | while read oneHour
do
    echo $oneHour
    sql="use $hive_db; LOAD DATA LOCAL INPATH '$data_dir/a.bc.d.$oneHour' OVERWRITE INTO TABLE $table PARTITION (hour=$oneHour);"
    echo "== $sql"
    /home/abc/cdh/hive/bin/hive -e "$sql"
done

It is better to generate all the LOAD DATA LOCAL INPATH ... statements as one batch file and then call hive -f on it, so that the hive client does not have to be started once per partition.
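A minimal sketch of that batching approach, reusing the variables from the script above; load.txt is a hypothetical file name:

# generate one LOAD statement per hour, then run them all in a single hive session
echo "use $hive_db;" > $base_dir/load.txt
cat $base_dir/hour.txt | while read oneHour
do
    echo "LOAD DATA LOCAL INPATH '$data_dir/a.bc.d.$oneHour' OVERWRITE INTO TABLE $table PARTITION (hour=$oneHour);" >> $base_dir/load.txt
done
/home/abc/cdh/hive/bin/hive -f $base_dir/load.txt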

At this point, the study of how to migrate Hive data is over. Hopefully it has resolved your doubts. Combining theory with practice is the best way to learn, so go and try it! If you want to keep learning more related knowledge, please continue to follow the website; the editor will keep working hard to bring you more practical articles!
