2025-01-18 Update From: SLTechnology News&Howtos > Servers
Shulou (Shulou.com) 06/01 Report
Many newcomers are unclear about how to write custom functions (UDFs) in Hive. To help with that, this article walks through the details step by step; hopefully you come away with something useful.
create table tab_array(a array<string>, b array<int>)
row format delimited
fields terminated by '\t'
collection items terminated by ',';
Sample data (fields separated by a tab):
abc,helloworld,itcats	213123214,432312321
select a[0] from tab_array;
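As a rough illustration of how those delimiters carve up a row (fields by '\t', collection items by ','), here is a standalone Java sketch; the class and method names are hypothetical, not part of Hive:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Hypothetical sketch: split one tab_array row the way the table's
// row format declares it (fields by '\t', collection items by ',').
public class ArrayLineDemo {
    public static List<List<String>> parseRow(String line) {
        String[] fields = line.split("\t");            // fields terminated by '\t'
        List<List<String>> row = new ArrayList<>();
        for (String f : fields) {
            row.add(Arrays.asList(f.split(",")));      // collection items terminated by ','
        }
        return row;
    }

    public static void main(String[] args) {
        List<List<String>> row = parseRow("abc,helloworld,itcats\t213123214,432312321");
        System.out.println(row.get(0).get(0));         // corresponds to a[0] -> abc
    }
}
```

So `select a[0] from tab_array;` on the sample row would return `abc`.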
create table tab_map(name string, info map<string,string>)
row format delimited
fields terminated by '\t'
collection items terminated by ';'
map keys terminated by ':';
Sample data:
fengjie age:18;size:36A;addr:usa;
furong age:28;size:39C;addr:beijing;weight:90KG
load data local inpath '/home/hadoop/hivetemp/tab_map.txt' overwrite into table tab_map;
insert into table tab_map select name,map('name',name,'ip',ip) from tab_ext;
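The map column relies on two delimiters at once: ';' between entries and ':' between key and value. A hypothetical standalone parser makes that concrete (it mimics the declared row format; it is not Hive's actual SerDe code):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch: parse the info field of one tab_map row,
// with map entries terminated by ';' and keys terminated by ':'.
public class MapLineDemo {
    public static Map<String, String> parseInfo(String field) {
        Map<String, String> info = new LinkedHashMap<>();
        for (String entry : field.split(";")) {        // collection items terminated by ';'
            if (entry.isEmpty()) continue;             // a trailing ';' yields an empty token
            String[] kv = entry.split(":", 2);         // map keys terminated by ':'
            info.put(kv[0], kv.length > 1 ? kv[1] : null);
        }
        return info;
    }

    public static void main(String[] args) {
        String line = "fengjie\tage:18;size:36A;addr:usa;";
        String[] fields = line.split("\t");            // fields terminated by '\t'
        System.out.println(fields[0] + " -> " + parseInfo(fields[1]).get("addr"));
    }
}
```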
create table tab_struct(name string, info struct<age:int,tel:string,addr:string>)
row format delimited
fields terminated by '\t'
collection items terminated by ',';
load data local inpath '/home/hadoop/hivetemp/tab_struct.txt' overwrite into table tab_struct;
insert into table tab_struct select name,named_struct('age',id,'tel',name,'addr',country) from tab_ext;
CLI shell: Hive can also be run non-interactively, writing the query result to a local file:
hive -S -e 'select country,count(*) from tab_ext group by country' > ~/soft/e.txt
select * from tab_ext sort by id desc limit 5;
select a.ip,b.book from tab_ext a join tab_ip_book b on (a.name=b.name);
package cn.itcas.hive.udf;

import java.util.HashMap;

import org.apache.hadoop.hive.ql.exec.UDF;

public class PhoneToArea extends UDF {

    private static HashMap<String, String> map = new HashMap<String, String>();
    static {
        map.put("136", "beijing");
        map.put("137", "tianjing");
        map.put("138", "nanjing");
        map.put("139", "shanghai");
        map.put("188", "tokyo");
    }

    // transform a phone number into the corresponding area name
    public String evaluate(String phoneNum) {
        String area = map.get(phoneNum.substring(0, 3));
        return area == null ? "huoxing" : area;
    }

    // sum the up flow and down flow
    public int evaluate(int upFlow, int downFlow) {
        return upFlow + downFlow;
    }
}
Note: the evaluate methods must be public, or Hive cannot call them.
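Before packaging the UDF, it can help to sanity-check the evaluate logic outside Hive. The sketch below replicates the two overloads in a plain class (names are hypothetical; the real UDF above must extend org.apache.hadoop.hive.ql.exec.UDF and run inside Hive):

```java
import java.util.HashMap;

// Hypothetical standalone replica of the PhoneToArea logic,
// runnable without the Hive runtime on the classpath.
public class PhoneToAreaDemo {
    private static final HashMap<String, String> map = new HashMap<>();
    static {
        map.put("136", "beijing");
        map.put("137", "tianjing");
        map.put("138", "nanjing");
        map.put("139", "shanghai");
        map.put("188", "tokyo");
    }

    // same prefix lookup as the UDF's evaluate(String)
    public static String area(String phoneNum) {
        String area = map.get(phoneNum.substring(0, 3));
        return area == null ? "huoxing" : area;
    }

    // same sum as the UDF's evaluate(int, int)
    public static int flowSum(int upFlow, int downFlow) {
        return upFlow + downFlow;
    }

    public static void main(String[] args) {
        System.out.println(area("13612345678")); // beijing
        System.out.println(area("13198530807")); // huoxing (131 is not mapped)
        System.out.println(flowSum(220, 300));   // 520
    }
}
```

The two overloads of evaluate are what let the same registered function name (areasum below) serve both calls, areasum(phonenbr) and areasum(upflow,downflow): Hive picks the overload by argument types.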
Package the class into hiveutil.jar and move it onto the server:
[root@hadoop04 hive-0.12.0]# mv ~/Desktop/hiveutil.jar ~/soft
hive> add jar /root/soft/hiveutil.jar
> ;
Added /root/soft/hiveutil.jar to class path
Added resource: /root/soft/hiveutil.jar
hive> create temporary function areasum as 'cn.itcas.hive.udf.PhoneToArea'
> ;
OK
Time taken: 0.006 seconds
[root@hadoop04 soft]# vi flow.data
13198530807,220,300
13233231222,330,450
13333231222,330,450
13433231222,330,450
13533231222,330,450
13633231222,330,450
13833231222,330,450
13933231222,330,450
18633231222,330,450
13233231222,330,450
hive> create table t_flow(phonenbr string ,upflow int,downflow int)
> row format delimited
> fields terminated by ',';
OK
Time taken: 14.112 seconds
hive> load data local inpath '/root/soft/flow.data' into table t_flow;
Copying data from file:/root/soft/flow.data
Copying file: file:/root/soft/flow.data
Loading data to table default.t_flow
Table default.t_flow stats: [num_partitions: 0, num_files: 1, num_rows: 0, total_size: 201, raw_data_size: 0]
OK
Time taken: 1.509 seconds
hive> select * from t_flow;
OK
13198530807 220 300
13233231222 330 450
13333231222 330 450
13433231222 330 450
13533231222 330 450
13633231222 330 450
13833231222 330 450
13933231222 330 450
18633231222 330 450
13233231222 330 450
NULL NULL
Time taken: 0.76 seconds, Fetched: 11 row(s)
hive> select phonenbr,areasum(phonenbr),areasum(upflow,downflow) from t_flow;
(If this query errors out, quit the Hive shell and run it again.)
13198530807 huoxing 520
13233231222 huoxing 780
13333231222 huoxing 780
13433231222 huoxing 780
13533231222 huoxing 780
13633231222 beijing 780
13833231222 nanjing 780
13933231222 shanghai 780
18633231222 huoxing 780
13233231222 huoxing 780