This article explains how to use Eclipse to write a custom Hive UDF function. The walkthrough below goes step by step, from creating the project to testing the function in the Hive command line.
Writing a custom Hive UDF function with Eclipse
During log analysis, Hive (part of the Hadoop framework) is used, but its built-in functions fall short for some log-processing tasks, so a UDF (user-defined function) is needed to extend them.
1 Create a new Java project hiveudf in Eclipse, then create a new class UDFLower (the article suggests package com.afan, but since the registration command in step 6 uses the bare class name 'UDFLower', the code below is shown in the default package).
2 Add two jar libraries to the project's build path: hadoop-core-1.1.2.jar (from Hadoop 1.1.2) and hive-exec-0.9.0.jar (from Hive 0.9.0).
3 Write the UDF class:

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

public class UDFLower extends UDF {
    // Hive calls evaluate() once per input row; a null input yields NULL output.
    public Text evaluate(final Text s) {
        if (null == s) {
            return null;
        }
        return new Text(s.toString().toLowerCase());
    }
}
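Before packaging, the class can be sanity-checked outside Hive with a plain main method (a minimal sketch; UDFLowerTest is a hypothetical helper not part of the original walkthrough, and it assumes the two jars from step 2 are on the classpath):

import org.apache.hadoop.io.Text;

// Hypothetical local check for UDFLower; not in the original article.
public class UDFLowerTest {
    public static void main(String[] args) {
        UDFLower udf = new UDFLower();
        System.out.println(udf.evaluate(new Text("HELLO"))); // expected: hello
        System.out.println(udf.evaluate(null));              // expected: null
    }
}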
4 Compile the project and export the packaged file as udf_hive.jar:
(The six screenshots of Eclipse's JAR export wizard that illustrated this step are omitted here.)
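If you prefer the command line to the export wizard, an equivalent jar can be built directly (a sketch; it assumes UDFLower.java sits next to the two jars and that bin/ is used as the output directory):

mkdir -p bin
javac -cp hadoop-core-1.1.2.jar:hive-exec-0.9.0.jar -d bin UDFLower.java
jar cf udf_hive.jar -C bin .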
5 Copy udf_hive.jar to the Linux machine where Hive is configured, e.g. at /root/data/udf_hive.jar.
6 Open the Hive command line and test it:
hive> add jar /root/data/udf_hive.jar;
Added udf_hive.jar to class path
Added resource: udf_hive.jar
Create the UDF function:
hive> create temporary function my_lower as 'UDFLower';
(Here 'UDFLower' is the fully qualified name of your class. If the class is in a package, for example cn.jiang.UDFLower, then as is followed by 'cn.jiang.UDFLower'; if there is no package, the bare class name 'UDFLower' is enough.)
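For instance, if the class had been created in the com.afan package suggested in step 1, the same registration would use the package-qualified name (a sketch; adjust to your actual package):

hive> create temporary function my_lower as 'com.afan.UDFLower';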
Create test data
hive> create table dual (name string);
Import the data file test.txt.
The content of the test.txt file is:
WHO
AM
I
HELLO
hive> load data local inpath '/root/data/test.txt' into table dual;
hive> select name from dual;
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_201105150525_0003, Tracking URL = http://localhost:50030/jobdetails.jsp?jobid=job_201105150525_0003
Kill Command = /usr/local/hadoop/bin/../bin/hadoop job -Dmapred.job.tracker=localhost:9001 -kill job_201105150525_0003
2011-05-15 06:46:05,459 Stage-1 map = 0%, reduce = 0%
2011-05-15 06:46:10,905 Stage-1 map = 100%, reduce = 0%
2011-05-15 06:46:13,963 Stage-1 map = 100%, reduce = 100%
Ended Job = job_201105150525_0003
OK
WHO
AM
I
HELLO
Use the UDF function:
hive> select my_lower(name) from dual;
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_201105150525_0002, Tracking URL = http://localhost:50030/jobdetails.jsp?jobid=job_201105150525_0002
Kill Command = /usr/local/hadoop/bin/../bin/hadoop job -Dmapred.job.tracker=localhost:9001 -kill job_201105150525_0002
2011-05-15 06:43:26,100 Stage-1 map = 0%, reduce = 0%
2011-05-15 06:43:14,364 Stage-1 map = 100%, reduce = 0%
2011-05-15 06:43:14 Stage-1 map = 100%, reduce = 100%
Ended Job = job_201105150525_0002
OK
who
am
i
hello
The test passed successfully.
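Temporary functions only live for the current Hive session. To remove one explicitly within the session, HiveQL provides a matching drop statement:

hive> drop temporary function my_lower;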
Reference: http://landyer.iteye.com/blog/1070377
That wraps up how to use Eclipse to write custom Hive UDF functions. The steps above cover the full cycle of writing, packaging, registering, and testing the class; verify them in your own environment before relying on them.