This article is about how to create a custom permanent function in hive. The editor thinks it is very practical, so it is shared here for reference; follow along to have a look.
Reasons for the requirement:
Hive provides a built-in nvl function, but it only handles NULL values. The requirement here is to also treat empty strings ("") as missing, so we need to write a custom function called nvls and integrate it into hive.
Specific steps:
1: write the contents of the java file as follows:
package org.apache.hadoop.hive.ql.udf;

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

public class MyNvl extends UDF {
    // Returns x when t is NULL or an empty string, otherwise returns t.
    public Text evaluate(final Text t, final Text x) {
        if (t != null) {
            if (t.toString().equals("")) {
                return x;
            }
        } else {
            return x;
        }
        return t;
    }
}
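Before packaging and uploading the jar in the next step, it can help to sanity-check the evaluate logic locally. The following is a minimal sketch of such a check, assuming the hive-exec and hadoop-common jars are on the compile classpath; the MyNvlTest class name and the test values are just illustrative and are not part of the article's steps.

import org.apache.hadoop.hive.ql.udf.MyNvl;
import org.apache.hadoop.io.Text;

public class MyNvlTest {
    public static void main(String[] args) {
        MyNvl udf = new MyNvl();
        Text fallback = new Text("default");
        // NULL input falls back to the second argument
        System.out.println(udf.evaluate(null, fallback));            // prints: default
        // empty-string input also falls back, which plain nvl would not do
        System.out.println(udf.evaluate(new Text(""), fallback));    // prints: default
        // a normal value passes through unchanged
        System.out.println(udf.evaluate(new Text("abc"), fallback)); // prints: abc
    }
}

If the output matches the comments, the evaluate logic behaves as intended and the jar can be built and uploaded.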
2: test:
2.1: first, package the class into a jar named lixiyuan.jar and upload it to the server (my account is zb_test). The jar is placed in the /data/zb_test directory.
Then enter hive and select the database to use for the test:
use test;
Then add the jar package:
add jar /data/zb_test/lixiyuan.jar;
Then create a temporary function:
create temporary function nvls as 'org.apache.hadoop.hive.ql.udf.MyNvl';
And then you can test it, for example by applying nvls to a column that is sometimes NULL and sometimes the empty string and checking that the fallback value comes back in both cases.
After the test is successful, we will integrate it into hive.
3: integrate the function into hive
First, a word about the idea behind adding a permanent function: the built-in functions in hive all live in the hive-exec-0.12.0-cdh6.0.0.jar package.
Specifically, we have to do two things:
1: first modify the environment variable with vi ./.bashrc, changing the CLASSPATH to the following (all on one line):
export CLASSPATH=$JAVA_HOME/lib:$JAVA_HOME/jre/lib:$SQOOP_HOME/lib:/opt/boh-2.0.0/hadoop/share/hadoop/tools/lib/*:/opt/boh-2.0.0/hadoop/share/hadoop/common/lib/*:/opt/boh-2.0.0/hadoop/share/hadoop/common/*:/opt/boh-2.0.0/hive/li
Then compile the java file you just wrote into a class file and put it into the udf directory of the extracted hive-exec-0.12.0-cdh6.0.0.jar.
2: modify the FunctionRegistry file in the exec directory of hive-exec-0.12.0-cdh6.0.0.jar. What we extracted, however, has already been compiled by other people, so we need to find the source code, locate the FunctionRegistry.java file, and add the following two lines (see the sketch after these two steps):
import org.apache.hadoop.hive.ql.udf.MyNvl;
registerUDF("nvls", MyNvl.class, false);
Just complete these two steps.
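To make the placement of those two lines concrete, here is a hedged sketch of where they sit in FunctionRegistry.java. Only the import and the registerUDF("nvls", ...) call are the actual edit; the class skeleton and the stand-in registerUDF method below are placeholders standing in for the real Hive 0.12 source, whose exact structure may differ.

package org.apache.hadoop.hive.ql.exec;

// (1) the new import, added next to the existing org.apache.hadoop.hive.ql.udf imports
import org.apache.hadoop.hive.ql.udf.MyNvl;

public final class FunctionRegistry {

    static {
        // ... the existing registerUDF calls for hive's built-in functions ...
        registerUDF("nvls", MyNvl.class, false);   // (2) the new line registering nvls
    }

    // placeholder so this sketch is self-contained; the real FunctionRegistry
    // already defines registerUDF and records the function in hive's registry
    static void registerUDF(String name, Class<?> udfClass, boolean isOperator) {
    }
}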
But here's the problem:
Completing the first step is simple, but the second step is a little more involved: after modifying the FunctionRegistry.java file, we have to compile it into a class file before we can replace the original FunctionRegistry.class. So the question now is how to compile the FunctionRegistry.java file.
3: how to compile the FunctionRegistry.java file:
Because FunctionRegistry.java now contains import org.apache.hadoop.hive.ql.udf.MyNvl; and registerUDF("nvls", MyNvl.class, false);, the compiler needs to be able to find the MyNvl class.
So we start by creating a directory tree like this: org/apache/hadoop/hive/ql/udf, and put the MyNvl.class file in it.
Then package the org directory into a jar named hello.jar:
jar -cvf /data/zb_test/hello.jar /data/zb_test/org/
Next, modify the environment variable again and change the CLASSPATH to the following:
export CLASSPATH=/data/zb_test/hello.jar:$JAVA_HOME/lib:$JAVA_HOME/jre/lib:$SQOOP_HOME/lib:/opt/boh-2.0.0/hadoop/share/hadoop/tools/lib/*:/opt/boh-2.0.0/hadoop/share/hadoop/common/lib/*:/opt/boh-2.0.0/hadoop/share/hadoop/common/*:/opt/boh-2.0.0/hive/li
Then you can compile the FunctionRegistry.java file:
javac ./FunctionRegistry.java
Then replace the original FunctionRegistry.class with the newly generated one, repackage the extracted contents back into hive-exec-0.12.0-cdh6.0.0.jar, and replace the original jar in hive. That's it.
Thank you for reading! This is the end of this article on how to create a custom permanent hive function. I hope the above content has been helpful and that you have learned something from it; if you think the article is good, feel free to share it so more people can see it!