This article mainly shows you how to customize the partitioner in Hadoop MapReduce. The content is straightforward and easy to follow, and I hope it helps clear up your doubts. Let's walk through "how to customize the partitioner in Hadoop MapReduce" together.
package hello_hadoop;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.DoubleWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Partitioner;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class AutoParitionner {
    public static void main(String[] args) throws IOException, ClassNotFoundException, InterruptedException {
        if (args.length != 2) {
            System.err.println("Usage: hadoop jar xxx.jar <input path> <output path>");
            System.exit(1);
        }
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "avg of grades");
        job.setJarByClass(AutoParitionner.class);
        job.setMapperClass(PartitionInputClass.class);
        job.setReducerClass(PartitionOutputClass.class);
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(DoubleWritable.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(DoubleWritable.class);
        // declare the custom partitioner class and the matching number of reduce tasks
        job.setPartitionerClass(MyPartitioner.class);
        job.setNumReduceTasks(2);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

class PartitionInputClass extends Mapper<LongWritable, Text, Text, DoubleWritable> {
    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String line = value.toString();
        if (line.length() > 0) {
            String[] array = line.split("\t");
            if (array.length == 2) {
                String name = array[0];
                int grade = Integer.parseInt(array[1]);
                context.write(new Text(name), new DoubleWritable(grade));
            }
        }
    }
}

class PartitionOutputClass extends Reducer<Text, DoubleWritable, Text, DoubleWritable> {
    @Override
    protected void reduce(Text text, Iterable<DoubleWritable> iterable, Context context)
            throws IOException, InterruptedException {
        double sum = 0;
        int cnt = 0;
        for (DoubleWritable iw : iterable) {
            sum += iw.get();
            cnt++;
        }
        context.write(text, new DoubleWritable(sum / cnt));
    }
}

// the custom partitioner; Text and DoubleWritable are the key and value types of the map output
class MyPartitioner extends Partitioner<Text, DoubleWritable> {
    @Override
    public int getPartition(Text text, DoubleWritable value, int numofreuceTask) {
        String name = text.toString();
        if (name.equals("wd") || name.equals("wzf") || name.equals("xzh") || name.equals("zz")) {
            return 0;
        } else {
            return 1;
        }
    }
}

That is all the content of "how to customize the partitioner in Hadoop MapReduce". Thank you for reading! I believe you now have a certain understanding of the topic, and I hope the shared content helps you.
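To sanity-check the routing logic without starting a cluster, a minimal sketch like the following calls getPartition directly. The check class, the extra sample name, and the grade value are illustrative assumptions, not part of the original article; it only needs the Hadoop client jars on the classpath, and MyPartitioner is visible because the class lives in the same package.

package hello_hadoop;

import org.apache.hadoop.io.DoubleWritable;
import org.apache.hadoop.io.Text;

// Hypothetical local check: feeds a few keys through MyPartitioner
// and prints which of the two reducers each record would reach.
public class MyPartitionerCheck {
    public static void main(String[] args) {
        MyPartitioner partitioner = new MyPartitioner();
        String[] names = {"wd", "wzf", "xzh", "zz", "someoneElse"};
        for (String name : names) {
            // 2 matches job.setNumReduceTasks(2) in the driver
            int partition = partitioner.getPartition(
                    new Text(name), new DoubleWritable(90.0), 2);
            System.out.println(name + " -> reducer " + partition);
        }
    }
}

The mapper expects each input line in the form name<TAB>grade. Because the driver sets two reduce tasks, the job produces two output files: part-r-00000 holds the averages for the four names routed to partition 0, and part-r-00001 holds everyone else.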