This article explains how to use MapReduce to compute per-phone-number mobile traffic statistics. The approach is simple, fast and practical; let's walk through it step by step.
The simulated input data, HTTP_20130313143750.dat, is as follows:
1363157985066 13726230503 00-FD-07-A4-72-B8:CMCC 120.196.100.82 i02.c.aliimg.com 24 27 2481 24681 200
1363157995052 13826544101 5C-0E-8B-C7-F1-E0:CMCC 120.197.40.4 40 264 0 200
1363157991076 13926435656 20-10-7A-28-CC-0A:CMCC 120.196.100.99 2 4 132 1512 200
1363154400022 13926251106 5C-0E-8B-8B-B1-50:CMCC 120.197.40.4 40 240 0 200
1363157993044 18211575961 94-71-AC-CD-E6-18:CMCC-EASY 120.196.100.99 iface.qiyi.com Video website 15 12 1527 2106 200
1363157995074 84138413 5C-0E-8B-8C-E8-20:7DaysInn 120.197.40.4 122.72.52.12 20 16 4116 1432
1363157993055 13560439658 C4-17-FE-BA-DE-D9:CMCC 120.196.100.99 18 15 1116 954 200
1363157995033 15920133257 5C-0E-8B-C7-BA-20:CMCC 120.197.40.4 sug.so.360.cn Information Security 20 20 3156 2936 200
1363157983019 13719199419 68-A1-B7-03-07-B1:CMCC-EASY 120.196.100.82 40 240 0 200
1363157984041 13660577991 5C-0E-8B-92-5C-20:CMCC-EASY 120.197.40.4 s19.cnzz.com site Statistics 24 9 6960 690 200
1363157973098 15013685858 5C-0E-8B-C7-F7-90:CMCC 120.197.40.4 rank.ie.sogou.com search engine 28 27 3659 3538 200
1363157986029 15989002119 E8-99-C4-4E-93-E0:CMCC-EASY 120.196.100.99 www.umeng.com site Statistics 3 3 1938 180 200
1363157992093 13560439658 C4-17-FE-BA-DE-D9:CMCC 120.196.100.99 15 9 918 4938 200
1363157986041 13480253104 5C-0E-8B-C7-FC-80:CMCC-EASY 120.197.40.4 3 3 180 180 200
1363157984040 13602846565 5C-0E-8B-8B-B6-00:CMCC 120.197.40.4 2052.flash3-http.qq.com Integrated Portal 15 12 1938 2910 200
1363157995093 13922314466 00-FD-07-A2-EC-BA:CMCC 120.196.100.82 img.qfc.cn 12 12 3008 3720 200
1363157982040 13502468823 5C-0A-5B-6A-0B-D4:CMCC-EASY 120.196.100.99 y0.ifengimg.com Integrated Portal 57 102 7335 110349 200
1363157986072 18320173382 84-25-DB-4F-10-1A:CMCC-EASY 120.196.100.99 input.shouji.sogou.com search engine 21 18 9531 2412 200
1363157990043 13925057413 00-1F-64-E1-E6-9A:CMCC 120.196.100.55 t3.baidu.com search engine 69 63 11058 48243 200
1363157988072 13760778710 00-FD-07-A4-7B-08:CMCC 120.196.100.82 2 120 120 200
1363157985079 13823070001 20-7C-8F-70-68-1F:CMCC 120.196.100.99 63 360 180 200
1363157985069 13600217502 00-1F-64-E2-E8-B1:CMCC 120.196.100.55 18 138 1080 186852 200
The format of the above log is as follows. Each record is a tab-separated line with these fields (the index is the position used by split("\t") in the code below):

0  timestamp
1  phone number (msisdn)
2  access-point MAC address
3  IP address
4  visited host
5  site type (e.g. "search engine", "Video website")
6  upPackNum (upstream packet count)
7  downPackNum (downstream packet count)
8  upPayLoad (upstream bytes)
9  downPayLoad (downstream bytes)
10 HTTP status code

In some records the host and site-type fields are empty, which is why fewer columns are visible in parts of the sample above. The job groups records by field 1 and sums fields 6 through 9 for each phone number.
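For example, under this layout the first record above (where the site-type field appears to be empty) reads as: timestamp 1363157985066, phone number 13726230503, access-point MAC 00-FD-07-A4-72-B8:CMCC, IP 120.196.100.82, host i02.c.aliimg.com, upPackNum 24, downPackNum 27, upPayLoad 2481, downPayLoad 24681, HTTP status 200.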
The MapReduce code is as follows
package MapReduce;

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;
import org.apache.hadoop.mapreduce.lib.partition.HashPartitioner;

public class KpiApp {

    static final String INPUT_PATH = "hdfs://hadoop:9000/wlan";
    static final String OUT_PATH = "hdfs://hadoop:9000/outwlan";

    public static void main(String[] args) throws Exception {
        final Job job = new Job(new Configuration(), KpiApp.class.getSimpleName());

        // 1.1 specify the input path
        FileInputFormat.setInputPaths(job, INPUT_PATH);
        // specify which class is used to parse the input file
        job.setInputFormatClass(TextInputFormat.class);

        // 1.2 specify the custom Mapper class
        job.setMapperClass(MyMapper.class);
        // specify the map output types
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(KpiWritable.class);

        // 1.3 specify the partitioner class
        job.setPartitionerClass(HashPartitioner.class);
        job.setNumReduceTasks(1);

        // 1.4 TODO sorting and grouping
        // 1.5 TODO (optional) combiner

        // 2.2 specify the custom Reducer class
        job.setReducerClass(MyReducer.class);
        // specify the final output types
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(KpiWritable.class);

        // 2.3 specify where the output goes
        FileOutputFormat.setOutputPath(job, new Path(OUT_PATH));
        // set the output format class
        job.setOutputFormatClass(TextOutputFormat.class);

        // submit the job and wait for it to finish
        job.waitForCompletion(true);
    }

    static class MyMapper extends Mapper<LongWritable, Text, Text, KpiWritable> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            final String[] splited = value.toString().split("\t");
            final String msisdn = splited[1];   // field 1: phone number
            final Text k2 = new Text(msisdn);
            // fields 6-9: upPackNum, downPackNum, upPayLoad, downPayLoad
            final KpiWritable v2 = new KpiWritable(splited[6], splited[7], splited[8], splited[9]);
            context.write(k2, v2);
        }
    }

    static class MyReducer extends Reducer<Text, KpiWritable, Text, KpiWritable> {
        /**
         * @param k2  a phone number found in the file
         * @param v2s the traffic records of that phone number at different times
         */
        @Override
        protected void reduce(Text k2, Iterable<KpiWritable> v2s, Context context)
                throws IOException, InterruptedException {
            long upPackNum = 0L;
            long downPackNum = 0L;
            long upPayLoad = 0L;
            long downPayLoad = 0L;
            for (KpiWritable kpiWritable : v2s) {
                upPackNum += kpiWritable.upPackNum;
                downPackNum += kpiWritable.downPackNum;
                upPayLoad += kpiWritable.upPayLoad;
                downPayLoad += kpiWritable.downPayLoad;
            }
            final KpiWritable v3 = new KpiWritable(upPackNum + "", downPackNum + "",
                    upPayLoad + "", downPayLoad + "");
            context.write(k2, v3);
        }
    }
}

class KpiWritable implements Writable {

    long upPackNum;
    long downPackNum;
    long upPayLoad;
    long downPayLoad;

    public KpiWritable() {
    }

    public KpiWritable(String upPackNum, String downPackNum, String upPayLoad, String downPayLoad) {
        this.upPackNum = Long.parseLong(upPackNum);
        this.downPackNum = Long.parseLong(downPackNum);
        this.upPayLoad = Long.parseLong(upPayLoad);
        this.downPayLoad = Long.parseLong(downPayLoad);
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        this.upPackNum = in.readLong();
        this.downPackNum = in.readLong();
        this.upPayLoad = in.readLong();
        this.downPayLoad = in.readLong();
    }

    @Override
    public void write(DataOutput out) throws IOException {
        out.writeLong(upPackNum);
        out.writeLong(downPackNum);
        out.writeLong(upPayLoad);
        out.writeLong(downPayLoad);
    }

    @Override
    public String toString() {
        return upPackNum + "\t" + downPackNum + "\t" + upPayLoad + "\t" + downPayLoad;
    }
}
Upload HTTP_20130313143750.dat to HDFS.
Run the MapReduce job, then view the contents of the output file /outwlan/part-*.
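A minimal command sequence for these steps might look like the following; the jar name kpi.jar is hypothetical, and the HDFS paths are taken from INPUT_PATH and OUT_PATH in the code above (adjust them to your cluster):

hadoop fs -mkdir /wlan
hadoop fs -put HTTP_20130313143750.dat /wlan
hadoop jar kpi.jar MapReduce.KpiApp
hadoop fs -cat /outwlan/part-*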
At this point, you should have a solid understanding of how MapReduce handles mobile traffic statistics. The best way to consolidate it is to try it out in practice yourself.