2025-01-16 Update From: SLTechnology News & Howtos (Servers)
Shulou (Shulou.com), 06/01 report
This article explains how to write and use a UDTF (user-defined table-generating function) in Hive. The content is fairly detailed, and I hope interested readers find it a useful reference.

Scenario: parsing JSON strings with a UDTF.

Without further ado, here is the code.
To write a UDTF, you must extend the GenericUDTF class and implement three methods: initialize, process, and close. initialize defines the column names and types of each output row. process does the actual work on the data, splitting one input row into multiple output rows; note that when an output row has multiple columns, the row must be passed to forward as an array or collection. close can be left empty.
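To illustrate the row fan-out that process performs, here is a plain-Java sketch with no Hive or JSON dependencies. The Map input stands in for the parsed JSON object, all names (FanOutDemo, fanOut) are illustrative, and a bounds guard is added that the real UDTF implicitly relies on the data to satisfy:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class FanOutDemo {
    // Turn one input "row" (a map of key -> step list) into multiple
    // 4-column output rows, mirroring the UDTF's process() logic.
    static List<String[]> fanOut(Map<String, List<String>> parsed) {
        List<String[]> rows = new ArrayList<>();
        for (Map.Entry<String, List<String>> e : parsed.entrySet()) {
            String key = e.getKey();
            // Drop everything after '&', as the UDTF does with the JSON key.
            int amp = key.indexOf('&');
            if (amp != -1) {
                key = key.substring(0, amp);
            }
            String[] out = new String[4];
            out[0] = key;
            List<String> steps = e.getValue();
            // Guard against more than 3 steps (the UDTF assumes at most 3).
            for (int i = 0; i < steps.size() && i < 3; i++) {
                out[i + 1] = steps.get(i);
            }
            rows.add(out); // one output row per input key
        }
        return rows;
    }

    public static void main(String[] args) {
        Map<String, List<String>> parsed = new LinkedHashMap<>();
        parsed.put("c100&extra", List.of("a", "b", "c"));
        parsed.put("c200", List.of("x"));
        for (String[] row : fanOut(parsed)) {
            System.out.println(String.join(",", row));
        }
        // prints:
        // c100,a,b,c
        // c200,x,null,null
    }
}
```

One input map with two keys produces two output rows, which is exactly the one-row-to-many-rows behavior a UDTF provides over a UDF.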
```java
package dw.udf;

import java.util.ArrayList;
import java.util.Iterator;

import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
import org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException;
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDTF;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorFactory;
import org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;
import org.json.JSONArray;
import org.json.JSONObject;

public class JsonParse extends GenericUDTF { // extend GenericUDTF

    @Override
    public StructObjectInspector initialize(ObjectInspector[] args) throws UDFArgumentException {
        if (args.length != 1) {
            throw new UDFArgumentLengthException("ExplodeMap takes only one argument");
        }
        if (args[0].getCategory() != ObjectInspector.Category.PRIMITIVE) {
            throw new UDFArgumentException("ExplodeMap takes string as a parameter");
        }
        // Define the output column names and their types.
        ArrayList<String> fieldNames = new ArrayList<String>();
        ArrayList<ObjectInspector> fieldOIs = new ArrayList<ObjectInspector>();
        fieldNames.add("containerid");
        fieldOIs.add(PrimitiveObjectInspectorFactory.javaStringObjectInspector);
        fieldNames.add("first_step");
        fieldOIs.add(PrimitiveObjectInspectorFactory.javaStringObjectInspector);
        fieldNames.add("second_step");
        fieldOIs.add(PrimitiveObjectInspectorFactory.javaStringObjectInspector);
        fieldNames.add("third_step");
        fieldOIs.add(PrimitiveObjectInspectorFactory.javaStringObjectInspector);
        return ObjectInspectorFactory.getStandardStructObjectInspector(fieldNames, fieldOIs);
    }

    @SuppressWarnings("rawtypes")
    @Override
    public void process(Object[] args) throws HiveException {
        try {
            JSONObject obj = new JSONObject(args[0].toString());
            Iterator it = obj.keys();
            while (it.hasNext()) {
                String key = (String) it.next();
                JSONArray array = obj.getJSONArray(key);
                // Keep only the part of the key before '&', if present.
                if (key.indexOf("&") != -1) {
                    key = key.substring(0, key.indexOf("&"));
                }
                String[] outstr = new String[4];
                outstr[0] = key;
                for (int i = 0; i < array.length(); i++) {
                    outstr[i + 1] = array.getString(i);
                }
                forward(outstr); // emit one output row per JSON key
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    @Override
    public void close() throws HiveException {
        // Nothing to clean up.
    }
}
```

That concludes this walkthrough of how to use a UDTF in Hive. I hope the content above helps you learn more, and if you found the article useful, feel free to share it with others.
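For completeness, here is a sketch of how a UDTF like this is typically registered and invoked from Hive. The jar path, function name, table, and column names are illustrative assumptions, not from the original article:

```sql
-- Register the compiled jar and bind the class to a function name
-- (path and function name are hypothetical).
ADD JAR /path/to/json-parse-udtf.jar;
CREATE TEMPORARY FUNCTION json_parse AS 'dw.udf.JsonParse';

-- A UDTF can be called on its own in the SELECT list...
SELECT json_parse(json_col) AS (containerid, first_step, second_step, third_step)
FROM src_table;

-- ...or via LATERAL VIEW to keep other columns alongside the exploded rows.
SELECT t.id, p.containerid, p.first_step, p.second_step, p.third_step
FROM src_table t
LATERAL VIEW json_parse(t.json_col) p AS containerid, first_step, second_step, third_step;
```

The LATERAL VIEW form is usually preferable in practice, since a bare UDTF call cannot be mixed with other expressions in the same SELECT list.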