2025-02-25 Update From: SLTechnology News&Howtos
Shulou(Shulou.com) 05/31 Report --
This article explains how to use MapReduce to list the names and salaries of employees who are paid more than their superiors. The idea is a reduce-side self-join on the employee table: the mapper emits each record under two keys so that every manager is grouped with his direct reports, and the reducer compares salaries within each group.
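Before the distributed version, it helps to see the same query in plain, single-machine Java: index each employee by EMPNO, resolve the MGR column against that index, and keep everyone who out-earns their manager. This is only an illustrative sketch; the class and method names below are mine, not from the article's code.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class PayAboveBossDemo {

    /** One EMP row; mgr == 0 means the employee has no manager. */
    static class Emp {
        final int empno;
        final String name;
        final int mgr;
        final int sal;

        Emp(int empno, String name, int mgr, int sal) {
            this.empno = empno;
            this.name = name;
            this.mgr = mgr;
            this.sal = sal;
        }
    }

    /** Returns "name,sal,bossName,bossSal" for every employee paid more than their manager. */
    static List<String> payAboveBoss(List<Emp> emps) {
        // Index every employee by EMPNO so the MGR column resolves in O(1).
        Map<Integer, Emp> byId = new HashMap<>();
        for (Emp e : emps) {
            byId.put(e.empno, e);
        }
        List<String> out = new ArrayList<>();
        for (Emp e : emps) {
            Emp boss = byId.get(e.mgr);     // null for the president, who has no manager
            if (boss != null && e.sal > boss.sal) {
                out.add(e.name + "," + e.sal + "," + boss.name + "," + boss.sal);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        List<Emp> data = Arrays.asList(
                new Emp(7369, "SMITH", 7902, 800),
                new Emp(7566, "JONES", 7839, 2975),
                new Emp(7839, "KING", 0, 5000),
                new Emp(7902, "FORD", 7566, 3000));
        System.out.println(payAboveBoss(data)); // prints [FORD,3000,JONES,2975]
    }
}
```

On a single machine a hash map is all the "join" we need; MapReduce replaces the map lookup with shuffling both sides of the join to the same reducer key, which is what the job below does.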
Data

The input is the classic Oracle EMP table, stored as a fixed-width text file:

EMPNO  ENAME   JOB        MGR   HIREDATE         SAL   COMM  DEPTNO
7369   SMITH   CLERK      7902  17-December-80   800         20
7499   ALLEN   SALESMAN   7698  20-February-81   1600  300   30
7521   WARD    SALESMAN   7698  22-February-81   1250  500   30
7566   JONES   MANAGER    7839  02-April-81      2975        20
7654   MARTIN  SALESMAN   7698  28-September-81  1250  1400  30
7698   BLAKE   MANAGER    7839  01-May-81        2850        30
7782   CLARK   MANAGER    7839  09-June-81       2450        10
7839   KING    PRESIDENT        17-November-81   5000        10
7844   TURNER  SALESMAN   7698  08-September-81  1500  0     30
7900   JAMES   CLERK      7698  03-December-81   950         30
7902   FORD    ANALYST    7566  03-December-81   3000        20
7934   MILLER  CLERK      7782  23-January-82    1300        10

Code

```java
package cn.kissoft.hadoop.week07;

import java.io.IOException;
import java.text.DateFormat;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

import cn.kissoft.hadoop.util.HdfsUtil;

/**
 * Homework-05: list the names and salaries of employees paid more than their boss.
 *
 * @author wukong (jinsong.sun@139.com)
 */
public class MorePayThanHigherups extends Configured implements Tool {

    public static class M extends Mapper<LongWritable, Text, Text, Text> {
        @Override
        public void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            // The input file is fixed-width, so fields are cut by column offset.
            String line = value.toString();
            String id   = line.substring(1, 11).trim();   // EMPNO
            String name = line.substring(11, 21).trim();  // ENAME
            String sal  = line.substring(57, 68).trim();  // SAL
            String pid  = line.substring(32, 43).trim();  // MGR
            // Emit each record twice: keyed by the manager's id as an "EMP"
            // record, and keyed by the employee's own id as a "BOSS" record.
            // The reducer then sees every manager grouped with his reports.
            context.write(new Text(pid), new Text("EMP," + pid + "," + name + "," + sal + "," + id));
            context.write(new Text(id), new Text("BOSS," + id + "," + name + "," + sal + "," + pid));
        }
    }

    public static class R extends Reducer<Text, Text, NullWritable, Text> {
        @Override
        public void reduce(Text key, Iterable<Text> values, Context context)
                throws IOException, InterruptedException {
            String bossName = null;
            int bossSal = 0;
            List<Emp> emps = new ArrayList<Emp>();
            for (Text value : values) {
                String[] ss = value.toString().split(",");
                if (ss[0].equals("EMP")) {            // there may be several employees
                    emps.add(new Emp(ss[2], Integer.parseInt(ss[3])));
                } else if (ss[0].equals("BOSS")) {    // at most one boss per key
                    bossName = ss[2];
                    bossSal = Integer.parseInt(ss[3]);
                }
            }
            for (Emp e : emps) {
                if (bossSal > 0 && e.getSal() > bossSal) {
                    context.write(NullWritable.get(),
                            new Text(e.getName() + "," + e.getSal() + "," + bossName + "," + bossSal));
                }
            }
        }
    }

    /** Simple value holder; the scraped listing omits this class, so it is reconstructed minimally. */
    public static class Emp {
        private final String name;
        private final int sal;

        public Emp(String name, int sal) {
            this.name = name;
            this.sal = sal;
        }

        public String getName() { return name; }
        public int getSal() { return sal; }
    }

    @Override
    public int run(String[] args) throws Exception {
        Configuration conf = getConf();
        Job job = new Job(conf, "Job-MorePayThanHigherupsMR");
        job.setJarByClass(this.getClass());
        job.setMapperClass(M.class);
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(Text.class);
        job.setReducerClass(R.class);
        job.setOutputFormatClass(TextOutputFormat.class);
        job.setOutputKeyClass(NullWritable.class);   // format of the output KEY
        job.setOutputValueClass(Text.class);         // format of the output VALUE
        FileInputFormat.addInputPath(job, new Path(args[0]));    // input path
        FileOutputFormat.setOutputPath(job, new Path(args[1]));  // output path
        return job.waitForCompletion(true) ? 0 : 1;
    }

    /**
     * @param args hdfs://bd11:9000/user/wukong/w07/emp.txt hdfs://bd11:9000/user/wukong/w07/out05/
     */
    public static void main(String[] args) throws Exception {
        checkArgs(args);
        HdfsUtil.rm(args[1], true);   // clear any previous output directory
        Date start = new Date();
        int res = ToolRunner.run(new Configuration(), new MorePayThanHigherups(), args);
        printExcuteTime(start, new Date());
        System.exit(res);
    }

    /**
     * Verifies the argument count; prints a usage message and exits when it is wrong.
     */
    private static void checkArgs(String[] args) {
        if (args.length != 2) {
            System.err.println();
            System.err.println("Usage: MorePayThanHigherups <input path> <output path>");
            System.exit(-1);
        }
    }

    /** Prints the job's elapsed time (reconstructed: the scraped listing truncates before this method). */
    private static void printExcuteTime(Date start, Date end) {
        DateFormat formatter = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        long millis = end.getTime() - start.getTime();
        System.out.println("Started : " + formatter.format(start));
        System.out.println("Finished: " + formatter.format(end));
        System.out.println("Elapsed : " + (millis / 1000.0) + " s");
    }
}
```

With the data above, only FORD (3000) earns more than his manager JONES (2975), so the job emits a single line: FORD,3000,JONES,2975.
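To see why the reducer works, here is a Hadoop-free simulation of a single reduce group: the tagged values the mapper would emit for key 7566 (JONES). The helper name `reduceGroup` is mine, but its body mirrors the logic of `R.reduce()` above.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class ReduceGroupDemo {

    /**
     * Mirrors R.reduce(): scan the tagged values for one key, remember the
     * single BOSS record, collect the EMP records, and emit
     * "name,sal,bossName,bossSal" for every employee out-earning the boss.
     */
    static List<String> reduceGroup(List<String> values) {
        String bossName = null;
        int bossSal = 0;
        List<String[]> emps = new ArrayList<>();
        for (String v : values) {
            String[] ss = v.split(",");
            if (ss[0].equals("EMP")) {
                emps.add(ss);                     // possibly many employees
            } else if (ss[0].equals("BOSS")) {
                bossName = ss[2];                 // at most one boss per key
                bossSal = Integer.parseInt(ss[3]);
            }
        }
        List<String> out = new ArrayList<>();
        for (String[] e : emps) {
            // ss layout from the mapper: tag, key id, name, sal, own/mgr id
            if (bossSal > 0 && Integer.parseInt(e[3]) > bossSal) {
                out.add(e[2] + "," + e[3] + "," + bossName + "," + bossSal);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        // Key 7566 (JONES): one BOSS record plus FORD's EMP record, exactly
        // as the mapper tags them.
        List<String> group = Arrays.asList(
                "BOSS,7566,JONES,2975,7839",
                "EMP,7566,FORD,3000,7902");
        System.out.println(reduceGroup(group)); // prints [FORD,3000,JONES,2975]
    }
}
```

The `bossSal > 0` guard also covers groups where no BOSS record arrives at all (e.g. the key of a MGR value that matches no EMPNO), since `bossSal` then stays 0 and nothing is emitted.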