2025-01-26 Update From: SLTechnology News&Howtos > Internet Technology
Shulou (Shulou.com) 06/03 Report--
The code for the Mapper class, which implements the map method:
import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class MyMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
    // Sentinel value marking a missing temperature reading
    private static final int MISSING = 9999;

    // map function: called once per input line
    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // One record per line
        String line = value.toString();
        // Year occupies columns 15-18
        String year = line.substring(15, 19);
        // Air temperature: skip the leading '+' sign if present
        int airTemperature;
        if (line.charAt(87) == '+') {
            airTemperature = Integer.parseInt(line.substring(88, 92));
        } else {
            airTemperature = Integer.parseInt(line.substring(87, 92));
        }
        // Emit only valid readings (quality codes 0, 1, 4, 5, 9)
        String quality = line.substring(92, 93);
        if (airTemperature != MISSING && quality.matches("[01459]")) {
            context.write(new Text(year), new IntWritable(airTemperature));
        }
    }
}
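The fixed-width parsing above can be exercised outside Hadoop. The sketch below fabricates a 93-character record (the sample data and the `NcdcParseSketch` class name are made up for illustration; only the column offsets 15-19, 87-92, and 92-93 come from the mapper above) and applies the same substring logic:

```java
public class NcdcParseSketch {
    // Build a fabricated record: filler zeros plus the fields the mapper reads.
    // signedTemp is the 5-character signed temperature, e.g. "+0011" or "-0250".
    static String record(String year, String signedTemp, String quality) {
        StringBuilder sb = new StringBuilder("0".repeat(93));
        sb.replace(15, 19, year);        // year at columns 15-18
        sb.replace(87, 92, signedTemp);  // sign + four digits at columns 87-91
        sb.replace(92, 93, quality);     // quality flag at column 92
        return sb.toString();
    }

    // The same parsing the map method performs
    static int parseTemperature(String line) {
        if (line.charAt(87) == '+') {
            return Integer.parseInt(line.substring(88, 92));
        }
        return Integer.parseInt(line.substring(87, 92));
    }

    public static void main(String[] args) {
        System.out.println(parseTemperature(record("1950", "+0011", "1"))); // prints 11
        System.out.println(parseTemperature(record("1950", "-0250", "1"))); // prints -250
    }
}
```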
The code for the Reducer class, which implements the reduce method:
import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class MyReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        // Track the maximum temperature seen for this key (year)
        int maxValue = Integer.MIN_VALUE;
        // Fold the maximum over all values for this key
        for (IntWritable value : values) {
            maxValue = Math.max(maxValue, value.get());
        }
        // Emit the year and its maximum temperature
        context.write(key, new IntWritable(maxValue));
    }
}
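The reduce step is just a running maximum. Its core loop can be checked with plain ints, independent of the Hadoop types (a sketch; `MaxSketch` is an illustrative name, not part of the job):

```java
import java.util.Arrays;
import java.util.List;

public class MaxSketch {
    // Mirrors the reducer's loop: fold a running maximum over the values
    static int max(List<Integer> values) {
        int maxValue = Integer.MIN_VALUE;
        for (int v : values) {
            maxValue = Math.max(maxValue, v);
        }
        return maxValue;
    }

    public static void main(String[] args) {
        System.out.println(max(Arrays.asList(11, -3, 42, 7))); // prints 42
    }
}
```

Starting from `Integer.MIN_VALUE` means any real reading replaces the initial value, so no special first-iteration case is needed.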
The code for the driver class with the main method (import MyMapper and MyReducer here if they live in a different package):
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class MyMapperApp {

    /**
     * @param args
     * @throws Exception
     * @throws IllegalArgumentException
     */
    public static void main(String[] args) throws IllegalArgumentException, Exception {
        // Create a new job
        Job job = Job.getInstance();
        // Locate the jar by class name
        job.setJarByClass(MyMapper.class);
        // Set the job name
        job.setJobName("Max temperature");
        FileInputFormat.addInputPath(job, new Path("file:///mnt/hgfs/test-ncdc-data"));
        FileOutputFormat.setOutputPath(job, new Path("file:///home/hadoop/mr/"));
        job.setMapperClass(MyMapper.class);
        job.setReducerClass(MyReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
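Before submitting to a cluster, the whole map-then-reduce flow can be sanity-checked in memory. The sketch below (the `LocalMaxByYear` class name and all sample records are fabricated for illustration) applies the mapper's parse-and-filter step and then the per-year maximum:

```java
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

public class LocalMaxByYear {
    // Fabricated fixed-width record, matching the offsets the mapper uses
    static String record(String year, String signedTemp, String quality) {
        StringBuilder sb = new StringBuilder("0".repeat(93));
        sb.replace(15, 19, year);
        sb.replace(87, 92, signedTemp);
        sb.replace(92, 93, quality);
        return sb.toString();
    }

    // Map (parse + filter) and reduce (max per year), all in memory
    static Map<String, Integer> run(List<String> lines) {
        Map<String, Integer> maxByYear = new TreeMap<>();
        for (String line : lines) {
            String year = line.substring(15, 19);
            int t = (line.charAt(87) == '+')
                    ? Integer.parseInt(line.substring(88, 92))
                    : Integer.parseInt(line.substring(87, 92));
            String quality = line.substring(92, 93);
            if (t != 9999 && quality.matches("[01459]")) {
                maxByYear.merge(year, t, Math::max);  // keep the larger reading
            }
        }
        return maxByYear;
    }

    public static void main(String[] args) {
        List<String> lines = List.of(
                record("1950", "+0011", "1"),
                record("1950", "+0078", "1"),
                record("1950", "+9999", "1"),  // missing reading, filtered out
                record("1951", "-0025", "0"));
        System.out.println(run(lines)); // prints {1950=78, 1951=-25}
    }
}
```

Since the maximum is associative and commutative, the real job could also register the reducer as a combiner via job.setCombinerClass(MyReducer.class) to cut shuffle traffic, a standard Hadoop optimization.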