2025-02-24 Update From: SLTechnology News & Howtos (shulou)
Shulou (Shulou.com) 06/03 report
Create a table:

CREATE TABLE mydb.userinfo (name STRING, address STRING)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
STORED AS TEXTFILE;

Create a partitioned table:

CREATE TABLE mydb.table_name (col1 STRING, col2 DATE, col3 DOUBLE)
PARTITIONED BY (datekey DATE)  -- composite partition columns can be listed here
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
STORED AS TEXTFILE;

Load local data into the table:

LOAD DATA LOCAL INPATH '/home/dahaizi/data/userinfo.txt' OVERWRITE INTO TABLE mydb.userinfo;

Insert data into the table:
INSERT INTO TABLE table_name (col1, col2, col3) VALUES ('axiajiajiaoyun', '2017-02-26', 1.0);
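The LOAD DATA statement above simply moves the file into the table's storage location, so the file must already match the table's delimiters (tab-separated fields, newline-terminated records for mydb.userinfo). A hypothetical userinfo.txt with made-up rows might look like:

```
zhangsan	Beijing
lisi	Shanghai
```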
Insert query results into an existing table:

INSERT INTO TABLE table_Name
PARTITION (DateKey)
SELECT col1, col2, col3, DateKey FROM otherTable
WHERE DateKey IN ('2017-02-26')
GROUP BY col1, col2, col3, DateKey
DISTRIBUTE BY DateKey;
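A partitioned INSERT ... SELECT like this relies on dynamic partitioning, since the partition value comes from the query output rather than from a literal. A minimal sketch of the full sequence, assuming the session has not enabled it yet (table names are the illustrative ones used above):

```sql
-- Dynamic partitioning must be switched on before the insert runs
set hive.exec.dynamic.partition=true;
set hive.exec.dynamic.partition.mode=nonstrict;

INSERT INTO TABLE table_Name PARTITION (DateKey)
SELECT col1, col2, col3, DateKey FROM otherTable;
-- Hive creates one partition per distinct DateKey value in the result
```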
Write query results to an HDFS directory:

INSERT OVERWRITE DIRECTORY '/jc_bdcqs/qsy'
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
SELECT * FROM zqsroomgslist;

Common configuration settings for HQL queries
1) Set the map-task failure tolerance (prevents the whole job from aborting when some map tasks fail):

set mapred.max.map.failures.percent=100;
2) Limit the number of output files produced by a query:

set mapred.reduce.tasks=1;
3) Cap the maximum number of reducers (this does not override an explicit mapred.reduce.tasks setting):

set hive.exec.reducers.max=100;
4) Set how much input data a single reducer processes in a job (default 1 GB):

set hive.exec.reducers.bytes.per.reducer=1000000000;
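Settings 3) and 4) together determine how many reducers Hive estimates for a stage when mapred.reduce.tasks is not set explicitly: roughly min(ceil(input bytes / bytes per reducer), reducers.max). A worked example with an assumed input size:

```sql
set hive.exec.reducers.bytes.per.reducer=1000000000;  -- 1 GB per reducer
set hive.exec.reducers.max=100;
-- For a stage that reads about 4,200,000,000 bytes of input, Hive estimates
-- ceil(4200000000 / 1000000000) = 5 reducers, well under the cap of 100.
```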
Set up dynamic partitions:

set hive.exec.dynamic.partition=true;  -- check the current value with: set hive.exec.dynamic.partition;
set hive.exec.dynamic.partition.mode=nonstrict;
set hive.exec.max.dynamic.partitions=100000;  -- an error is raised if a query creates more partitions than this
set hive.exec.max.dynamic.partitions.pernode=100000;

Delete a table:
DROP TABLE tb_name;

Or empty the table instead:

TRUNCATE TABLE table_name;

Delete a partition:

ALTER TABLE table_Name DROP PARTITION (Datekey='20190606');

Add a partition:

ALTER TABLE tb_name ADD PARTITION (Datekey='20190606');
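After adding or dropping partitions, the result can be verified with SHOW PARTITIONS (tb_name is the placeholder table name used above):

```sql
SHOW PARTITIONS tb_name;
-- prints one line per existing partition, e.g. datekey=20190606
```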