This article explains the Sqoop architecture and its commonly used commands in detail. The content is practical, so it is shared here as a reference; I hope you get something out of it.
Introduction to Sqoop
Sqoop is a tool for transferring data between Hadoop and relational databases. Data from a relational database (such as MySQL or Oracle) can be imported into Hadoop's HDFS, and HDFS data can be exported back into a relational database.
Sqoop architecture diagram: (the original diagram is not reproduced here)
Introduction to common commands
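Before running any of the imports below, it can help to verify database connectivity first. A minimal sketch using Sqoop's list-databases tool (the host and credentials are placeholders):
sqoop list-databases --connect jdbc:mysql://192.168.1.1/ --username ${username} --password ${password}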
MySQL => Hive (importing MySQL data into Hive)
1. Migrate a specified MySQL table to Hive
sqoop import --hive-import --connect jdbc:mysql://192.168.1.1/dbname --table ${tablename} --username ${username} --password ${password} --hive-database ${dbname} -m 1 --as-parquetfile
For example:
sqoop import --hive-import --connect jdbc:mysql://172.16.16.15/test --table person --username mdba --password dsf0723 --hive-database test -m 1 --as-parquetfile
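After the import finishes, the result can be checked from the Hive CLI (a hedged sketch, assuming the hive command is on the PATH and the table landed in database test):
hive -e 'SELECT * FROM test.person LIMIT 10'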
2. Migrate all tables in a specified MySQL database to Hive
sqoop import-all-tables --hive-import --connect jdbc:mysql://192.168.1.1/dbname --username ${username} --password ${password} --hive-database ${dbname} -m 1 --as-parquetfile
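If a few tables should be skipped, import-all-tables also accepts an --exclude-tables option (a sketch; table1,table2 are placeholder names):
sqoop import-all-tables --hive-import --connect jdbc:mysql://192.168.1.1/dbname --username ${username} --password ${password} --hive-database ${dbname} --exclude-tables table1,table2 -m 1 --as-parquetfile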
Oracle => Hive: change jdbc:mysql: in the commands above to jdbc:oracle:thin:
For example:
sqoop import --hive-import --connect jdbc:oracle:thin:@172.16.16.16:1523:orcl --table test --username cq2017 --password cq2017 --hive-database chongqing_2017 --hive-table test_20170505 -m 1 --as-parquetfile
Hive (HDFS) => MySQL (exporting from Hive to MySQL)
sqoop export --connect jdbc:mysql://192.168.1.1:3306/dbname --username root --password 123 --export-dir '<HDFS storage path of the Hive table>' --table mysqltablename -m 1 --fields-terminated-by '\t'
Note: the MySQL target table (mysqltablename) must be created in advance, with columns, column types, and delimiter matching the Hive table; see the sketch after these notes.
Note: if the Hive table has no special delimiter requirement, the default delimiter is \u0001 and the --fields-terminated-by '\t' option can be omitted.
Note: the HDFS storage path of a Hive table is /user/hive/warehouse/<database name>/<table name>.
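For example, the target table could be created from the shell before running the export (a minimal sketch; the column list here is hypothetical and must be adapted to match the actual Hive table):
mysql -u root -p dbname -e 'CREATE TABLE mysqltablename (id INT, name VARCHAR(255))'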
Other commands:
Category 1: import data from the database into HDFS
sqoop import --connect jdbc:mysql://lishiyu06.10:3306/web_log --username root --password 123 --table user --columns 'id,name,incoam,expenses'
Use --columns to migrate only the specified fields of a MySQL table.
The column names here are strictly case-sensitive
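When no --target-dir is given, Sqoop writes the files under the running user's HDFS home directory, in a subdirectory named after the table; they can be inspected with a command like the following (a sketch, assuming the job ran as user root):
hdfs dfs -cat /user/root/user/part-m-00000 | head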
Specify the output path and the data delimiter:
sqoop import --connect jdbc:mysql://lishiyu06:3306/web_log --username root --password 123 --table user --target-dir '/sqoop/td' --fields-terminated-by '\t'
Specify the number of map tasks with -m:
sqoop import --connect jdbc:mysql://lishiyu06.10:3306/web_log --username root --password 123 --table user --target-dir '/sqoop/td1' --fields-terminated-by '\t' -m 2
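With -m 2 the output directory should contain one part-m-0000x file per map task, which can be confirmed with a listing (an illustrative command, not output from this article):
hdfs dfs -ls /sqoop/td1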
Add a WHERE condition (note: the condition must be enclosed in quotation marks); this is useful for incremental imports, and a sketch of Sqoop's built-in incremental mode follows the example below:
sqoop import --connect jdbc:mysql://lishiyu06.10:3306/web_log --username root --password 123 --table user --where 'id>3' --target-dir '/sqoop/td2'
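For recurring loads, Sqoop also offers a built-in incremental mode that can replace a hand-written --where filter. A hedged sketch against the same table (--check-column names the key column, --last-value the highest value already imported):
sqoop import --connect jdbc:mysql://lishiyu06.10:3306/web_log --username root --password 123 --table user --target-dir /sqoop/td2 --incremental append --check-column id --last-value 3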
Add a query statement (use \ to continue the command across lines):
sqoop import --connect jdbc:mysql://lishiyu06.10:3306/web_log --username root --password 123 \
--query 'SELECT * FROM user WHERE id > 2 AND $CONDITIONS' --split-by user.id --target-dir '/sqoop/td3'
Note: when using --query, the WHERE clause must include AND $CONDITIONS.
Single and double quotation marks also behave differently: if the --query string is enclosed in double quotes, $CONDITIONS must be escaped as \$CONDITIONS, as shown in the sketch below.
If the number of map tasks is set to 1 (-m 1), --split-by ${tablename.column} is not required.
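For example, the double-quoted variant of the command above would look like this (a sketch; /sqoop/td4 is a hypothetical output path):
sqoop import --connect jdbc:mysql://lishiyu06.10:3306/web_log --username root --password 123 \
--query "SELECT * FROM user WHERE id > 2 AND \$CONDITIONS" --split-by user.id --target-dir /sqoop/td4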
This concludes this article on the Sqoop architecture and its common commands. I hope the content above has been helpful; if you found the article useful, please share it so more people can see it.