This article walks through the problems commonly encountered when exporting data from Hive to MySQL with Sqoop. It should be a useful reference; I hope you gain a lot from working through it.
Environment
Hive version: hive-0.11.0. Sqoop version: sqoop-1.4.4.bin__hadoop-1.0.0. Data is exported from Hive into the following MySQL table:
mysql> desc cps_activation;
+------------+-------------+------+-----+---------+----------------+
| Field      | Type        | Null | Key | Default | Extra          |
+------------+-------------+------+-----+---------+----------------+
| id         | int(11)     | NO   | PRI | NULL    | auto_increment |
| day        | date        | NO   | MUL | NULL    |                |
| pkgname    | varchar(50) | YES  |     | NULL    |                |
| cid        | varchar(50) | YES  |     | NULL    |                |
| pid        | varchar(50) | YES  |     | NULL    |                |
| activation | int(11)     | YES  |     | NULL    |                |
+------------+-------------+------+-----+---------+----------------+
6 rows in set (0.01 sec)
Hive table
hive> desc active;
OK
id          int     None
day         string  None
pkgname     string  None
cid         string  None
pid         string  None
activation  int     None
First, test that the connection works:
[hadoop@hs11 ~]$ sqoop list-databases --connect jdbc:mysql://localhost:3306/ --username root --password admin
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
13-08-20 16:42:26 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13-08-20 16:42:26 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
information_schema
easyhadoop
mysql
test
[hadoop@hs11 ~]$ sqoop list-databases --connect jdbc:mysql://localhost:3306/test --username root --password admin
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
13-08-20 16:42:40 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13-08-20 16:42:40 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
information_schema
easyhadoop
mysql
test
[hadoop@hs11 ~]$ sqoop list-tables --connect jdbc:mysql://localhost:3306/test --username root --password admin
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
13-08-20 16:42:54 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13-08-20 16:42:54 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
active
[hadoop@hs11 ~]$ sqoop create-hive-table --connect jdbc:mysql://localhost:3306/test --table active --username root --password admin --hive-table test
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
13-08-20 16:57:04 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13-08-20 16:57:04 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
13-08-20 16:57:04 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
13-08-20 16:57:04 WARN tool.BaseSqoopTool: It seems that you've specified at least one of following:
13-08-20 16:57:04 WARN tool.BaseSqoopTool:      --hive-home
13-08-20 16:57:04 WARN tool.BaseSqoopTool:      --hive-overwrite
13-08-20 16:57:04 WARN tool.BaseSqoopTool:      --create-hive-table
13-08-20 16:57:04 WARN tool.BaseSqoopTool:      --hive-table
13-08-20 16:57:04 WARN tool.BaseSqoopTool:      --hive-partition-key
13-08-20 16:57:04 WARN tool.BaseSqoopTool:      --hive-partition-value
13-08-20 16:57:04 WARN tool.BaseSqoopTool:      --map-column-hive
13-08-20 16:57:04 WARN tool.BaseSqoopTool: Without specifying parameter --hive-import. Please note that
13-08-20 16:57:04 WARN tool.BaseSqoopTool: those arguments will not be used in this session. Either
13-08-20 16:57:04 WARN tool.BaseSqoopTool: specify --hive-import to apply them correctly or remove them
13-08-20 16:57:04 WARN tool.BaseSqoopTool: from command line to remove this warning.
13-08-20 16:57:04 INFO tool.BaseSqoopTool: Please note that --hive-home, --hive-partition-key,
13-08-20 16:57:04 INFO tool.BaseSqoopTool:      hive-partition-value and --map-column-hive options are
13-08-20 16:57:04 INFO tool.BaseSqoopTool:      also valid for HCatalog imports and exports
13-08-20 16:57:04 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
13-08-20 16:57:05 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `active` AS t LIMIT 1
13-08-20 16:57:05 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `active` AS t LIMIT 1
13-08-20 16:57:05 WARN hive.TableDefWriter: Column day had to be cast to a less precise type in Hive
13-08-20 16:57:05 INFO hive.HiveImport: Loading uploaded data into Hive
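As the warnings above say, the Hive-specific options (--hive-table and friends) only take effect when --hive-import is also supplied. A minimal sketch of a one-step import that applies them, reusing the connection details from above and the -P prompt the log itself recommends:

# Hypothetical sketch: import the MySQL table `active` into the Hive table `test` in one step.
# --hive-import is what activates the Hive options that create-hive-table alone warns about.
sqoop import --connect jdbc:mysql://localhost:3306/test \
  --username root -P \
  --table active \
  --hive-import --hive-table test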
1. Connection refused
[hadoop@hs11 ~]$ sqoop export --connect jdbc:mysql://localhost/test --username root --password admin --table test --export-dir /user/hive/warehouse/actmp
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
13-08-21 09:14:07 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13-08-21 09:14:07 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
13-08-21 09:14:07 INFO tool.CodeGenTool: Beginning code generation
13-08-21 09:14:07 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
13-08-21 09:14:07 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
13-08-21 09:14:07 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/hadoop-1.1.2
Note: /tmp/sqoop-hadoop/compile/0b5cae714a00b3940fb793c3694408ac/test.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
13-08-21 09:14:08 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/0b5cae714a00b3940fb793c3694408ac/test.jar
13-08-21 09:14:08 INFO mapreduce.ExportJobBase: Beginning export of test
13-08-21 09:14:09 INFO input.FileInputFormat: Total input paths to process: 1
13-08-21 09:14:09 INFO input.FileInputFormat: Total input paths to process: 1
13-08-21 09:14:09 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13-08-21 09:14:09 WARN snappy.LoadSnappy: Snappy native library not loaded
13-08-21 09:14:10 INFO mapred.JobClient: Running job: job_201307251523_0059
13-08-21 09:14:11 INFO mapred.JobClient: map 0% reduce 0%
13-08-21 09:14:20 INFO mapred.JobClient: Task Id: attempt_201307251523_0059_m_000000_0, Status: FAILED
java.io.IOException: com.mysql.jdbc.CommunicationsException: Communications link failure due to underlying exception:
** BEGIN NESTED EXCEPTION **
java.net.ConnectException
MESSAGE: Connection refused
STACKTRACE:
java.net.ConnectException: Connection refused
    at java.net.PlainSocketImpl.socketConnect(Native Method)
    at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:351)
    at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:213)
    at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:200)
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366)
    at java.net.Socket.connect(Socket.java:529)
    at java.net.Socket.connect(Socket.java:478)
    at java.net.Socket.<init>(Socket.java:375)
    at java.net.Socket.<init>(Socket.java:218)
    at com.mysql.jdbc.StandardSocketFactory.connect(StandardSocketFactory.java:256)
    at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:271)
    at com.mysql.jdbc.Connection.createNewIO(Connection.java:2771)
    at com.mysql.jdbc.Connection.<init>(Connection.java:1555)
    at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:285)
    at java.sql.DriverManager.getConnection(DriverManager.java:582)
    at java.sql.DriverManager.getConnection(DriverManager.java:185)
    at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:294)
    at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.<init>(AsyncSqlRecordWriter.java:76)
    at org.apache.sqoop.mapreduce.ExportOutputFormat$ExportRecordWriter.<init>(ExportOutputFormat.java:95)
    at org.apache.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:77)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:628)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:753)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
** END NESTED EXCEPTION **
Last packet sent to the server was 1 ms ago.
    at org.apache.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:79)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:628)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:753)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: com.mysql.jdbc.CommunicationsException: Communications link failure due to underlying exception:
** BEGIN NESTED EXCEPTION **
java.net.ConnectException
MESSAGE: Connection refused
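The connection is refused because the export runs as MapReduce tasks on the cluster's worker nodes, so jdbc:mysql://localhost/test makes every mapper try to reach a MySQL server on its own machine. Besides the grants below, the fix is to put the MySQL host's real address in the connection string. A sketch, using the 10.10.20.11 address that the later runs in this article use:

# Point the mappers at the actual MySQL host instead of localhost.
sqoop export --connect jdbc:mysql://10.10.20.11:3306/test \
  --username root -P \
  --table test \
  --export-dir /user/hive/warehouse/actmp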
This is also a MySQL user-permissions problem. Check the current grants and open up remote access:
mysql> show grants;
mysql> GRANT ALL PRIVILEGES ON *.* TO 'root'@'%' IDENTIFIED BY PASSWORD '*4ACFE3202A5FF5CF467898FC58AAB1D615029441' WITH GRANT OPTION;
mysql> FLUSH PRIVILEGES;
Then create the target table (and, for cps_activation, a unique index):
mysql> create table test (mkey varchar(30), pkg varchar(50), cid varchar(20), pid varchar(50), count int, primary key (mkey,pkg,cid,pid));
mysql> alter ignore table cps_activation add unique index_day_pkgname_cid_pid (`day`, `pkgname`, `cid`, `pid`);
Query OK, 0 rows affected (0.03 sec)
2. Table does not exist
[hadoop@hs11 ~]$ sqoop export --connect jdbc:mysql://10.10.20.11/test --username root --password admin --table test --export-dir /user/hive/warehouse/actmp
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
13-08-21 09:16:26 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13-08-21 09:16:26 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
13-08-21 09:16:26 INFO tool.CodeGenTool: Beginning code generation
13-08-21 09:16:27 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
13-08-21 09:16:27 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
13-08-21 09:16:27 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/hadoop-1.1.2
Note: /tmp/sqoop-hadoop/compile/74d18a91ec141f2feb777dc698bf7eb4/test.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
13-08-21 09:16:28 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/74d18a91ec141f2feb777dc698bf7eb4/test.jar
13-08-21 09:16:28 INFO mapreduce.ExportJobBase: Beginning export of test
13-08-21 09:16:29 INFO input.FileInputFormat: Total input paths to process: 1
13-08-21 09:16:29 INFO input.FileInputFormat: Total input paths to process: 1
13-08-21 09:16:29 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13-08-21 09:16:29 WARN snappy.LoadSnappy: Snappy native library not loaded
13-08-21 09:16:29 INFO mapred.JobClient: Running job: job_201307251523_0060
13-08-21 09:16:30 INFO mapred.JobClient: map 0% reduce 0%
13-08-21 09:16:38 INFO mapred.JobClient: Task Id: attempt_201307251523_0060_m_000000_0, Status: FAILED
java.io.IOException: Can't export data, please check task tracker logs
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
    at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.util.NoSuchElementException
    at java.util.AbstractList$Itr.next(AbstractList.java:350)
    at test.__loadFromFields(test.java:252)
    at test.parse(test.java:201)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
    ... 10 more
When exporting data to MySQL, the target table must of course exist first. Beyond that, this error is raised because Sqoop cannot split the file's fields in a way that corresponds to the MySQL table's columns. You therefore need to pass Sqoop a parameter at execution time telling it the file's field delimiter, so that it can parse the fields correctly. Hive's default field delimiter is '\001'.
3. The null-value placeholder must be specified
No null placeholder was specified, so the fields were misaligned (Hive writes NULL as the literal string '\N').
[hadoop@hs11 ~]$ sqoop export --connect jdbc:mysql://10.10.20.11/test --username root --password admin --table test --export-dir /user/hive/warehouse/actmp --input-fields-terminated-by '\001'
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
13-08-21 09:21:07 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13-08-21 09:21:07 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
13-08-21 09:21:07 INFO tool.CodeGenTool: Beginning code generation
13-08-21 09:21:07 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
13-08-21 09:21:07 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
13-08-21 09:21:07 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/hadoop-1.1.2
Note: /tmp/sqoop-hadoop/compile/04d183c9e534cdb8d735e1bdc4be3deb/test.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
13-08-21 09:21:08 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/04d183c9e534cdb8d735e1bdc4be3deb/test.jar
13-08-21 09:21:08 INFO mapreduce.ExportJobBase: Beginning export of test
13-08-21 09:21:09 INFO input.FileInputFormat: Total input paths to process: 1
13-08-21 09:21:09 INFO input.FileInputFormat: Total input paths to process: 1
13-08-21 09:21:09 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13-08-21 09:21:09 WARN snappy.LoadSnappy: Snappy native library not loaded
13-08-21 09:21:10 INFO mapred.JobClient: Running job: job_201307251523_0061
13-08-21 09:21:11 INFO mapred.JobClient: map 0% reduce 0%
13-08-21 09:21:17 INFO mapred.JobClient: map 25% reduce 0%
13-08-21 09:21:19 INFO mapred.JobClient: map 50% reduce 0%
13-08-21 09:21:21 INFO mapred.JobClient: Task Id: attempt_201307251523_0061_m_000001_0, Status: FAILED
java.io.IOException: Can't export data, please check task tracker logs
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
    at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.lang.NumberFormatException: For input string: "665A5FFA-32C9-9463-1943-840A5FEAE193"
    at java.lang.NumberFormatException.forInputString(NumberFormatException.java:48)
    at java.lang.Integer.parseInt(Integer.java:458)
    at java.lang.Integer.valueOf(Integer.java:554)
    at test.__loadFromFields(test.java:264)
    at test.parse(test.java:201)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
    ... 10 more
4. Success
[hadoop@hs11 ~]$ sqoop export --connect jdbc:mysql://10.10.20.11/test --username root --password admin --table test --export-dir /user/hive/warehouse/actmp --input-fields-terminated-by '\001' --input-null-string '\N' --input-null-non-string '\N'
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
13-08-21 09:36:13 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13-08-21 09:36:13 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
13-08-21 09:36:13 INFO tool.CodeGenTool: Beginning code generation
13-08-21 09:36:13 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
13-08-21 09:36:13 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
13-08-21 09:36:13 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/hadoop-1.1.2
Note: /tmp/sqoop-hadoop/compile/e22d31391498b790d799897cde25047d/test.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
13-08-21 09:36:14 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/e22d31391498b790d799897cde25047d/test.jar
13-08-21 09:36:14 INFO mapreduce.ExportJobBase: Beginning export of test
13-08-21 09:36:15 INFO input.FileInputFormat: Total input paths to process: 1
13-08-21 09:36:15 INFO input.FileInputFormat: Total input paths to process: 1
13-08-21 09:36:15 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13-08-21 09:36:15 WARN snappy.LoadSnappy: Snappy native library not loaded
13-08-21 09:36:16 INFO mapred.JobClient: Running job: job_201307251523_0064
13-08-21 09:36:17 INFO mapred.JobClient: map 0% reduce 0%
13-08-21 09:36:23 INFO mapred.JobClient: map 25% reduce 0%
13-08-21 09:36:25 INFO mapred.JobClient: map 100% reduce 0%
13-08-21 09:36:27 INFO mapred.JobClient: Job complete: job_201307251523_0064
13-08-21 09:36:27 INFO mapred.JobClient: Counters: 18
13-08-21 09:36:27 INFO mapred.JobClient: Job Counters
13-08-21 09:36:27 INFO mapred.JobClient: SLOTS_MILLIS_MAPS=13151
13-08-21 09:36:27 INFO mapred.JobClient: Total time spent by all reduces waiting after reserving slots (ms)=0
13-08-21 09:36:27 INFO mapred.JobClient: Total time spent by all maps waiting after reserving slots (ms)=0
13-08-21 09:36:27 INFO mapred.JobClient: Rack-local map tasks=2
13-08-21 09:36:27 INFO mapred.JobClient: Launched map tasks=4
13-08-21 09:36:27 INFO mapred.JobClient: SLOTS_MILLIS_REDUCES=0
13-08-21 09:36:27 INFO mapred.JobClient: File Output Format Counters
13-08-21 09:36:27 INFO mapred.JobClient: Bytes Written=0
13-08-21 09:36:27 INFO mapred.JobClient: FileSystemCounters
13-08-21 09:36:27 INFO mapred.JobClient: HDFS_BYTES_READ=1519
13-08-21 09:36:27 INFO mapred.JobClient: FILE_BYTES_WRITTEN=234149
13-08-21 09:36:27 INFO mapred.JobClient: File Input Format Counters
13-08-21 09:36:27 INFO mapred.JobClient: Bytes Read=0
13-08-21 09:36:27 INFO mapred.JobClient: Map-Reduce Framework
13-08-21 09:36:27 INFO mapred.JobClient: Map input records=6
13-08-21 09:36:27 INFO mapred.JobClient: Physical memory (bytes) snapshot=663863296
13-08-21 09:36:27 INFO mapred.JobClient: Spilled Records=0
13-08-21 09:36:27 INFO mapred.JobClient: CPU time spent (ms)=3720
13-08-21 09:36:27 INFO mapred.JobClient: Total committed heap usage (bytes)=2013790208
13-08-21 09:36:27 INFO mapred.JobClient: Virtual memory (bytes) snapshot=5583151104
13-08-21 09:36:27 INFO mapred.JobClient: Map output records=6
13-08-21 09:36:27 INFO mapred.JobClient: SPLIT_RAW_BYTES=571
13-08-21 09:36:27 INFO mapreduce.ExportJobBase: Transferred 1.4834 KB in 12.1574 seconds (124.9446 bytes/sec)
13-08-21 09:36:27 INFO mapreduce.ExportJobBase: Exported 6 records.
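The counters above report 6 exported records. A quick check on the MySQL side (same test table as created earlier) confirms they arrived:

mysql> select count(*) from test;
mysql> select * from test limit 6;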
5. The MySQL column is defined too short to hold the data
java.io.IOException: com.mysql.jdbc.MysqlDataTruncation: Data truncation: Data too long for column 'pid' at row 1
    at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:192)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: com.mysql.jdbc.MysqlDataTruncation: Data truncation: Data too long for column 'pid' at row 1
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2983)
    at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1631)
    at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:1723)
    at com.mysql.jdbc.Connection.execSQL(Connection.java:3283)
    at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:1332)
    at com.mysql.jdbc.PreparedStatement.execute(PreparedStatement.java:882)
    at org.apache.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:233)
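The fix is to widen the MySQL column so the exported values fit. A sketch, assuming the offending pid values exceed the varchar(50) defined earlier (the 100 here is a hypothetical size; pick one that covers your data):

mysql> alter table test modify column pid varchar(100);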
6. Date format problem
For a MySQL date column, the string in Hive must be in yyyy-MM-dd format. I originally used yyyymmdd, which produced the following error:
13-08-21 17:42:44 INFO mapred.JobClient: Task Id: attempt_201307251523_0079_m_000000_1, Status: FAILED
java.io.IOException: Can't export data, please check task tracker logs
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
    at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.lang.IllegalArgumentException
    at java.sql.Date.valueOf(Date.java:138)
    at cps_activation.__loadFromFields(cps_activation.java:308)
    at cps_activation.parse(cps_activation.java:255)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
    ... 10 more
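One way to fix this is to rewrite the date strings on the Hive side before exporting. A sketch using Hive's built-in unix_timestamp/from_unixtime functions, assuming the export reads from a staging table actmp whose columns mirror the active table:

hive> insert overwrite table actmp
    > select id, from_unixtime(unix_timestamp(day, 'yyyyMMdd'), 'yyyy-MM-dd'), pkgname, cid, pid, activation
    > from active;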
7. Field mismatch or field type inconsistency
Caused by: java.lang.NumberFormatException: For input string: "06701A4A-0808-E9A8-0D28-A8020B494E37"
    at java.lang.NumberFormatException.forInputString(NumberFormatException.java:48)
    at java.lang.Integer.parseInt(Integer.java:458)
    at java.lang.Integer.valueOf(Integer.java:554)
    at test.__loadFromFields(test.java:264)
    at test.parse(test.java:201)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
    ... 10 more
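This is the same symptom as in problem 3: a string field lands in a column Sqoop expects to be numeric, because the file's fields and the table's columns do not line up. Besides getting the delimiter and null placeholders right, the column order can be pinned explicitly with --columns. A sketch, assuming the column list of the test table created earlier:

sqoop export --connect jdbc:mysql://10.10.20.11/test \
  --username root -P \
  --table test \
  --columns "mkey,pkg,cid,pid,count" \
  --export-dir /user/hive/warehouse/actmp \
  --input-fields-terminated-by '\001' \
  --input-null-string '\N' --input-null-non-string '\N'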
Thank you for reading this article carefully. I hope this walkthrough of the problems Sqoop encounters when exporting from Hive to MySQL has been helpful.