Hive storage is built on the Hadoop HDFS file system; access to metadata is organized through the default embedded Derby database or an external database system such as MySQL. The storage process is illustrated with a practical example below.
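For reference, pointing the metastore at an external MySQL database is usually done through hive-site.xml. The snippet below is a minimal sketch: the connection URL, user name, and password are placeholders, not the configuration used in this environment.

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value> <!-- placeholder host/database -->
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive_user</value> <!-- placeholder -->
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hive_password</value> <!-- placeholder -->
</property>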
1. Create a table in Hive, then import the external CSV file into it (the external file is Batting.csv and the internal table is temp_batting):
hive> create table temp_batting (col_value STRING);
hive> show tables;
OK
temp_batting
...
hive> LOAD DATA INPATH 'hive/data/Batting.csv' OVERWRITE INTO TABLE temp_batting;
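At this point each row of temp_batting holds one raw CSV line in the single col_value column. A quick way to sanity-check the load, and a hedged sketch of how such lines are usually split into fields with regexp_extract (the field positions below, player_id in field 1 and year in field 2, are assumptions about the Batting.csv layout):

hive> SELECT col_value FROM temp_batting LIMIT 3;
hive> SELECT regexp_extract(col_value, '^(?:([^,]*),?){1}', 1) AS player_id,
    >        regexp_extract(col_value, '^(?:([^,]*),?){2}', 1) AS year
    > FROM temp_batting LIMIT 3;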
2. Looking at the external MySQL database, you can see the newly created temp_batting table:
mysql> use hive
Reading table information for completion of table and column names
You can turn off this feature to get a quicker startup with -A

mysql> select * from TBLS;
+--------+-------------+-------+------------------+-------+-----------+-------+--------------+---------------+--------------------+--------------------+
| TBL_ID | CREATE_TIME | DB_ID | LAST_ACCESS_TIME | OWNER | RETENTION | SD_ID | TBL_NAME     | TBL_TYPE      | VIEW_EXPANDED_TEXT | VIEW_ORIGINAL_TEXT |
+--------+-------------+-------+------------------+-------+-----------+-------+--------------+---------------+--------------------+--------------------+
|     66 |  1432707070 |     1 |                0 | root  |         0 |    66 | temp_batting | MANAGED_TABLE | NULL               | NULL               |
+--------+-------------+-------+------------------+-------+-----------+-------+--------------+---------------+--------------------+--------------------+
...
View its storage path on HDFS:

mysql> select * from SDS;
+-------+-------+-------------------------------------------+---------------+---------------------------+---------------------------------------------------------+-------------+-------------------------------------------------------------+----------+
| SD_ID | CD_ID | INPUT_FORMAT                              | IS_COMPRESSED | IS_STOREDASSUBDIRECTORIES | LOCATION                                                | NUM_BUCKETS | OUTPUT_FORMAT                                               | SERDE_ID |
+-------+-------+-------------------------------------------+---------------+---------------------------+---------------------------------------------------------+-------------+-------------------------------------------------------------+----------+
|    66 |    71 | org.apache.hadoop.mapred.TextInputFormat  |               |                           | hdfs://localhost:9000/user/hive/warehouse/temp_batting |          -1 | org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat |       66 |
+-------+-------+-------------------------------------------+---------------+---------------------------+---------------------------------------------------------+-------------+-------------------------------------------------------------+----------+
You can see that the location is:
hdfs://localhost:9000/user/hive/warehouse/temp_batting
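The metastore tables are linked by SD_ID, so the table name and its HDFS location can also be read in a single query. A minimal sketch, assuming the default metastore schema:

mysql> SELECT t.TBL_NAME, s.LOCATION
    ->   FROM TBLS t JOIN SDS s ON t.SD_ID = s.SD_ID
    ->  WHERE t.TBL_NAME = 'temp_batting';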
3. Check the table's path in the Hadoop HDFS file system:
[root@lr rli]# hadoop dfs -ls /user/hive/warehouse
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
...
drwxr-xr-x   - root supergroup          0 2015-05-27 14:16 /user/hive/warehouse/temp_batting
...
[root@lr rli]# hadoop dfs -ls /user/hive/warehouse/temp_batting
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
Found 1 items
-rwxr-xr-x   1 root supergroup    6398990 2015-05-27 14:02 /user/hive/warehouse/temp_batting/Batting.csv
You can see the file's size in the listing, and its contents can be read directly from HDFS.
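For example, a quick look at the first few lines of the stored file, using the non-deprecated hdfs command (the head count is arbitrary):

[root@lr rli]# hdfs dfs -cat /user/hive/warehouse/temp_batting/Batting.csv | head -n 5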
Conclusion:
Hive records the storage paths and attributes of files in its associated metastore database, while the actual data is stored in HDFS. When a SELECT or similar operation generates the corresponding map/reduce job, that data is read and processed further.
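For example, a simple aggregate over the loaded table is enough to launch such a job (a sketch; the exact job output and counters will differ per environment):

hive> SELECT count(*) FROM temp_batting;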