2025-01-16 Update From: SLTechnology News&Howtos shulou NAV: SLTechnology News&Howtos > Servers >
Shulou(Shulou.com)06/01 Report--
This article shows how to use Hadoop counters. The content is straightforward and easy to follow; the example below walks through data cleaning (ETL) with a map-only MapReduce job.
Data cleaning (ETL)
Before running the core business MapReduce program, it is often necessary to clean the data first, removing records that do not meet the requirements. This cleaning usually only needs a Mapper; no Reducer is required.
1. Requirement
Remove log lines that contain 11 or fewer fields.
(1) Input data
web.log
(2) Expected output data
Every remaining line has more than 11 fields.
2. Requirement analysis
The input data is filtered in the Map phase according to the cleaning rule; invalid lines are discarded and never written out.
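The cleaning rule boils down to a single predicate on each line. A minimal plain-Java sketch of that rule (the class name LogFilter is illustrative, not part of the article's code), assuming fields are separated by single spaces:

```java
// Keep a log line only if it has more than 11 space-separated fields.
public class LogFilter {
    public static boolean isValid(String line) {
        String[] fields = line.split(" ");
        return fields.length > 11;
    }
}
```

The Mapper below applies exactly this check, with the addition of counters to record how many lines were kept or dropped.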
3. Implementation code
(1) Write the LogMapper class
package com.atguigu.mapreduce.weblog;

import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class LogMapper extends Mapper<LongWritable, Text, Text, NullWritable> {

    Text k = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // 1 Get one line of data
        String line = value.toString();
        // 2 Parse the log line
        boolean result = parseLog(line, context);
        // 3 Skip invalid log lines
        if (!result) {
            return;
        }
        // 4 Set the key
        k.set(line);
        // 5 Write out the data
        context.write(k, NullWritable.get());
    }

    // Parse a log line; count valid and invalid lines with custom counters
    private boolean parseLog(String line, Context context) {
        // 1 Split the line into fields
        String[] fields = line.split(" ");
        // 2 Valid if it has more than 11 fields
        if (fields.length > 11) {
            // Custom counter: group "map", counter "true"
            context.getCounter("map", "true").increment(1);
            return true;
        } else {
            context.getCounter("map", "false").increment(1);
            return false;
        }
    }
}
(2) Write the LogDriver class
package com.atguigu.mapreduce.weblog;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class LogDriver {

    public static void main(String[] args) throws Exception {
        // The input/output paths need to be set according to the actual paths on your machine
        args = new String[] { "e:/input/inputlog", "e:/output1" };

        // 1 Get job information
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf);

        // 2 Load the jar
        job.setJarByClass(LogDriver.class);

        // 3 Associate the mapper
        job.setMapperClass(LogMapper.class);

        // 4 Set the final output types
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(NullWritable.class);

        // 5 Set the number of reduce tasks to 0 (map-only job)
        job.setNumReduceTasks(0);

        // Set the input and output paths
        FileInputFormat.setInputPaths(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        // 6 Submit the job
        job.waitForCompletion(true);
    }
}

That covers how to use Hadoop counters in a map-only ETL job. Thank you for reading!
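When the job finishes, the two custom counters appear in the job's console output under the "map" group. Cluster-side, they can also be read back with job.getCounters().findCounter("map", "true").getValue(). Since that requires a running Hadoop job, here is a plain-Java illustration (no cluster needed; the class name CounterDemo and the tally helper are assumptions for this sketch) of exactly what the two counters accumulate:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Simulates the "map"/"true" and "map"/"false" counter bookkeeping:
// one increment per line kept, one per line dropped.
public class CounterDemo {
    public static Map<String, Long> tally(String[] lines) {
        Map<String, Long> counters = new LinkedHashMap<>();
        counters.put("true", 0L);
        counters.put("false", 0L);
        for (String line : lines) {
            // Same rule as LogMapper.parseLog: keep lines with > 11 fields
            String key = line.split(" ").length > 11 ? "true" : "false";
            counters.put(key, counters.get(key) + 1);
        }
        return counters;
    }

    public static void main(String[] args) {
        String[] sample = {
            "a b c d e f g h i j k l",  // 12 fields -> kept
            "a b c"                     // 3 fields  -> dropped
        };
        System.out.println(CounterDemo.tally(sample)); // prints {true=1, false=1}
    }
}
```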
© 2024 shulou.com SLNews company. All rights reserved.