
How to migrate Hadoop data

2025-01-18 Update From: SLTechnology News&Howtos


Shulou (Shulou.com) 06/01 Report --

This article explains how to migrate Hadoop data. The method introduced here is simple, fast, and practical; the examples below walk through it step by step.

Instructions for using the distcp tool:

Once the table structure / partition has been created, execute the following on any node of the hadoop2.0 cluster:

hadoop distcp -Dmapreduce.job.queue.name=queue_name -update -skipcrccheck [source_path...] <target_path>

1. The source_path parameter is the full path of the table / partition data directory to be copied on hadoop1.0, and it must be accessed over the hftp protocol. For example, the path of the test table in the test library is hftp://hadoop1:50070/user/hive/warehouse/test.db/test.

2. The target_path parameter is the path of the corresponding table / partition data directory on hadoop2.0, such as /user/hive/warehouse/test.db/test.

3. When copying between clusters of different versions, the parameters -update -skipcrccheck must be included.

4. The -Dmapreduce.job.queue.name parameter specifies the resource pool in which the task runs, for example: -Dmapreduce.job.queue.name=queue_0401_01.

5. When a distcp task is executed, it runs as a MapReduce job and occupies resources in the specified resource pool.
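The steps above can be sketched as a small helper script. This is a dry-run sketch, assuming hypothetical table names and the queue from the examples: it only prints the distcp command for each table, so the commands can be reviewed before being executed on a hadoop2.0 node.

```shell
#!/bin/sh
# Dry-run helper: print the distcp command for each table to be migrated.
# SRC_NN, QUEUE, and the table list are illustrative assumptions.
SRC_NN="hftp://hadoop1:50070"        # hadoop1.0 NameNode, read over hftp
WAREHOUSE="/user/hive/warehouse"
QUEUE="queue_0401_01"                # resource pool, as in the examples above

CMDS=""
for tbl in test.db/test social.db/sina_wb_timelines; do
  cmd="hadoop distcp -Dmapreduce.job.queue.name=${QUEUE} -update -skipcrccheck ${SRC_NN}${WAREHOUSE}/${tbl} ${WAREHOUSE}/${tbl}"
  CMDS="${CMDS}${cmd}
"
done
printf '%s' "$CMDS"                  # review, then run each line on a hadoop2.0 node
```

Printing instead of executing keeps the sketch safe to run anywhere; replace the final printf with a loop that actually invokes each command once the paths and queue are confirmed.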

Example of copying table data:

hadoop distcp -Dmapreduce.job.queue.name=queue_0401_01 -update -skipcrccheck hftp://hadoop1:50070/user/hive/warehouse/social.db/sina_wb_timelines /user/hive/warehouse/social.db/sina_wb_timelines

Example of copying partition data:

hadoop distcp -Dmapreduce.job.queue.name=queue_0401_01 -update -skipcrccheck hftp://hadoop1:50070/user/hive/warehouse/social.db/sina_wb_timelines/d=21 /user/hive/warehouse/social.db/sina_wb_timelines/d=21

1. Create the table structure:

CREATE TABLE `fin_fa_wide_asset` (
`period_name` string,
`set_of_books_id` string,
`book_type_code` string,
`segment1` string,
`segment2` string,
`segment3` string,
`asset_id` string,
`accountion` string,
`asset_category_id` string,
`asset_number` string,
`use_department` string,
`operating_status` string,
`use_status` string,
`use_people` string,
`city` string,
`location` string,
`units_assigned` double,
`date_placed_in_service` string,
`deprn_run_date` string,
`cost` double,
`original_cost` double,
`salvage_value` double,
`recoverable_cost` double,
`current_net_value` double,
`ytd_deprn` double,
`deprn_reserve` double,
`salvage_cost_rate` double,
`deprn_method_code` string,
`deprn_in_months` double,
`life_in_months` double,
`deprn_amount` double,
`deprn_adjustment_acct` string,
`po_number` string,
`asset_invoice_id` string,
`invoice_number` string)
PARTITIONED BY (
`y` string,
`m` string)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' STORED AS RCFILE;
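The table must exist on the hadoop2.0 cluster before data is copied into its warehouse directory. A minimal sketch of applying the DDL with the standard hive CLI, assuming a scratch file path and using an abbreviated column list for brevity (the real script would contain the full DDL above):

```shell
#!/bin/sh
# Save an abbreviated version of the DDL and show the hive command that
# would apply it. The command is echoed as a dry run; execute it on a
# node where a Hive client is configured.
DDL=/tmp/fin_fa_wide_asset.sql
cat > "$DDL" <<'EOF'
CREATE TABLE `fin_fa_wide_asset` (
  `period_name` string,
  `asset_id` string,
  `cost` double)
PARTITIONED BY (`y` string, `m` string)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001'
STORED AS RCFILE;
EOF
echo "hive -f $DDL"
```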

2. Copy the data:

hadoop distcp -Dmapreduce.job.queue.name=queue_0009_01 -update -skipcrccheck hftp://hadoop1:50070/user/hive/warehouse/jt_mas_safe.db/fin_fa_wide_asset /user/hive/warehouse/jt_mas_safe.db/fin_fa_wide_asset

3. Load the data; the exact script depends on the table's partitions:

alter table jt_mas_safe.fin_fa_wide_asset add partition (y='2015', m='08');

load data inpath '/user/hive/warehouse/jt_mas_safe.db/fin_fa_wide_asset/y=2015/m=08' into table jt_mas_safe.fin_fa_wide_asset partition (y='2015', m='08');
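With many partitions, writing these statements by hand is tedious. A dry-run sketch that generates the add-partition and load statements for a list of (y, m) pairs; the partition list here is an illustrative assumption:

```shell
#!/bin/sh
# Generate the Hive statements that register and load each copied partition.
# TBL and DIR match the example above; the (y m) pairs are assumptions.
TBL="jt_mas_safe.fin_fa_wide_asset"
DIR="/user/hive/warehouse/jt_mas_safe.db/fin_fa_wide_asset"

HQL=""
for p in "2015 08" "2015 09"; do
  set -- $p                          # split "y m" into positional params
  y=$1; m=$2
  HQL="${HQL}alter table ${TBL} add partition (y='${y}', m='${m}');
load data inpath '${DIR}/y=${y}/m=${m}' into table ${TBL} partition (y='${y}', m='${m}');
"
done
printf '%s' "$HQL"                   # review, then run with: hive -e "$HQL"
```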

At this point, you should have a deeper understanding of how to migrate Hadoop data. Try it out in practice, and follow us to continue learning!
