This article mainly shows you how to use Eclipse to write custom Hive UDF functions. The content is easy to understand and clearly organized; I hope it resolves your doubts. Let's study "how to use Eclipse to write custom Hive UDF functions" together.
Writing custom Hive UDF functions in Eclipse
During log analysis we use Hive from the Hadoop framework, but some of the log processing cannot be handled by Hive's built-in functions, so we need UDFs to extend it.
1 Create a new Java project hiveudf in Eclipse, then create a new class with package name com.afan and class name UDFLower.
2 Add the two jar libraries hadoop-core-1.1.2.jar (from Hadoop 1.1.2) and hive-exec-0.9.0.jar (from Hive 0.9.0) to the project's build path.
3 Write the UDF class:

package com.afan;

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Lower-cases the input text; returns null when the input is null.
public class UDFLower extends UDF {
    public Text evaluate(final Text s) {
        if (null == s) {
            return null;
        }
        return new Text(s.toString().toLowerCase());
    }
}
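Before packaging, you can sanity-check the class with a plain main method. This is a minimal sketch: UDFLowerTest is not part of the article's project, and running it needs the two jars from step 2 on the classpath.

package com.afan;

import org.apache.hadoop.io.Text;

public class UDFLowerTest {
    public static void main(String[] args) {
        UDFLower udf = new UDFLower();
        // Expect "hello": evaluate() lower-cases its input.
        System.out.println(udf.evaluate(new Text("HELLO")));
        // Expect null: evaluate() passes a null input through.
        System.out.println(udf.evaluate(null));
    }
}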
4 Compile the project and export it as the jar file udf_hive.jar:
(Six screenshots of the Eclipse JAR export wizard, steps 1 through 6, appeared here in the original.)
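If you prefer the command line to the Eclipse export wizard, an equivalent sketch (assuming the two jars from step 2 and the com/afan/UDFLower.java source sit in the project directory) is:

javac -cp hadoop-core-1.1.2.jar:hive-exec-0.9.0.jar com/afan/UDFLower.java
jar cf udf_hive.jar com/afan/UDFLower.class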
5 Put udf_hive.jar on the configured Linux system, at the path /root/data/udf_hive.jar.
6 Open the Hive command line and test:
hive> add jar /root/data/udf_hive.jar;
Added udf_hive.jar to class path
Added resource: udf_hive.jar
Create the UDF function:
hive> create temporary function my_lower as 'com.afan.UDFLower';
The string after as is the fully qualified name of your class. With the package com.afan from step 1 it is 'com.afan.UDFLower'; likewise, a class in package cn.jiang would be 'cn.jiang.UDFLower'. If the class has no package, just write the class name, 'UDFLower'.
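Before using it, you can confirm the registration (standard Hive syntax):

hive> describe function my_lower;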
Create test data
hive> create table dual (name string);
Import the data file test.txt. The content of the test.txt file is:
WHO
AM
I
HELLO
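For reference, the file can be created like this (a bash one-liner; any editor works just as well):

echo -e "WHO\nAM\nI\nHELLO" > /root/data/test.txt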
hive> load data local inpath '/root/data/test.txt' into table dual;
hive> select name from dual;
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_201105150525_0003, Tracking URL = http://localhost:50030/jobdetails.jsp?jobid=job_201105150525_0003
Kill Command = /usr/local/hadoop/bin/../bin/hadoop job -Dmapred.job.tracker=localhost:9001 -kill job_201105150525_0003
2011-05-15 06:46:05,459 Stage-1 map = 0%, reduce = 0%
2011-05-15 06:46:10,905 Stage-1 map = 100%, reduce = 0%
2011-05-15 06:46:13,963 Stage-1 map = 100%, reduce = 100%
Ended Job = job_201105150525_0003
OK
WHO
AM
I
HELLO
Use the UDF function:
hive> select my_lower(name) from dual;
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_201105150525_0002, Tracking URL = http://localhost:50030/jobdetails.jsp?jobid=job_201105150525_0002
Kill Command = /usr/local/hadoop/bin/../bin/hadoop job -Dmapred.job.tracker=localhost:9001 -kill job_201105150525_0002
2011-05-15 06:43:26,100 Stage-1 map = 0%, reduce = 0%
2011-05-15 06:43:31,436 Stage-1 map = 100%, reduce = 0%
2011-05-15 06:43:34,314 Stage-1 map = 100%, reduce = 100%
Ended Job = job_201105150525_0002
OK
Who
Am
I
Hello
The test passed successfully.
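When you are finished, the function can be removed from the current session with standard Hive syntax (temporary functions also disappear when the session ends):

hive> drop temporary function my_lower;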
These are all the contents of the article "how to write custom Hive UDF functions using Eclipse". Thank you for reading! I believe you now have a general understanding, and I hope the shared content helps you. If you want to learn more, welcome to follow the industry information channel!