2025-01-16 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)06/03 Report--
1) Start the environment
start-all.sh
2) Check the process status
jps
0613 NameNode
10733 DataNode
3455 NodeManager
15423 Jps
11082 ResourceManager
10913 SecondaryNameNode
3) Use Eclipse to write the jar
1. Write the MapCal class
package com.mp;

import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class MapCal extends Mapper<LongWritable, Text, Text, Text> {
    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String line = value.toString();
        String[] peps = line.split("-");
        // emit two tagged key-value pairs per input record:
        // under the child's name, "s" tags the parent; under the parent's name, "g" tags the child
        context.write(new Text(peps[0]), new Text("s" + peps[1]));
        context.write(new Text(peps[1]), new Text("g" + peps[0]));
    }
}
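To see what the mapper emits without running Hadoop, here is a minimal, framework-free sketch of the same tagging logic using plain Java collections in place of `Context` (the class and method names, and the sample record `Tom-Lucy`, are illustrative assumptions, not part of the job code):

```java
import java.util.ArrayList;
import java.util.List;

public class MapCalSketch {
    // Returns the two tagged (key, value) pairs the mapper would emit for one line.
    static List<String[]> mapLine(String line) {
        String[] peps = line.split("-");
        List<String[]> out = new ArrayList<>();
        out.add(new String[] { peps[0], "s" + peps[1] }); // "s": the parent side
        out.add(new String[] { peps[1], "g" + peps[0] }); // "g": the child side
        return out;
    }

    public static void main(String[] args) {
        for (String[] kv : mapLine("Tom-Lucy")) {
            System.out.println(kv[0] + "\t" + kv[1]);
        }
    }
}
```

For the record `Tom-Lucy` this produces the pairs (`Tom`, `sLucy`) and (`Lucy`, `gTom`), so each person's key later gathers both their parents and their children.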
2. Write the ReduceCal class
public class ReduceCal extends Reducer<Text, Text, Text, Text> {
    @Override
    protected void reduce(Text key, Iterable<Text> values, Context context)
            throws IOException, InterruptedException {
        ArrayList<Text> grands = new ArrayList<>();
        ArrayList<Text> sons = new ArrayList<>();
        // sort the tagged values into the two collections;
        // copy each Text, because Hadoop reuses the object behind the iterator
        for (Text text : values) {
            String str = text.toString();
            if (str.startsWith("g")) {
                grands.add(new Text(text));
            } else {
                sons.add(new Text(text));
            }
        }
        // output the cross product of the two lists
        for (int i = 0; i < grands.size(); i++) {
            for (int j = 0; j < sons.size(); j++) {
                context.write(grands.get(i), sons.get(j));
            }
        }
    }
}
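The reduce step is a reduce-side join: for each person, every "g"-tagged value (a child) is paired with every "s"-tagged value (a parent), which links grandchildren to grandparents through the middle person. A framework-free sketch of that cross product (names and sample values here are illustrative assumptions):

```java
import java.util.ArrayList;
import java.util.List;

public class ReduceCalSketch {
    // Pairs every "g"-tagged value with every "s"-tagged value for one key.
    static List<String[]> join(List<String> values) {
        List<String> grands = new ArrayList<>();
        List<String> sons = new ArrayList<>();
        for (String v : values) {
            if (v.startsWith("g")) grands.add(v); else sons.add(v);
        }
        List<String[]> out = new ArrayList<>();
        for (String g : grands) {
            for (String s : sons) {
                out.add(new String[] { g, s });
            }
        }
        return out;
    }

    public static void main(String[] args) {
        // For key "Lucy": gTom marks Tom as Lucy's child, sMary marks Mary as
        // Lucy's parent, so the pair links grandchild Tom to grandparent Mary.
        for (String[] pair : join(List.of("gTom", "sMary"))) {
            System.out.println(pair[0] + "\t" + pair[1]);
        }
    }
}
```

Note that the values keep their "g"/"s" prefixes in the output, exactly as in the job above; stripping the first character before writing would yield the bare names.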
3. Write the RunJob class
public class RunJob {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        // For local multithreaded simulation, uncomment:
        // conf.set("fs.defaultFS", "hdfs://node3:8020");
        // conf.set("mapred.jar", "C:\\Users\\Administrator\\Desktop\\wc.jar");
        try {
            FileSystem fs = FileSystem.get(conf);
            Job job = Job.getInstance(conf);
            job.setJobName("wc");
            job.setJarByClass(RunJob.class);
            job.setMapperClass(MapCal.class);
            job.setReducerClass(ReduceCal.class);
            job.setMapOutputKeyClass(Text.class);
            job.setMapOutputValueClass(Text.class);
            // input and output directories for the job
            FileInputFormat.addInputPath(job, new Path("/word.txt"));
            Path outPath = new Path("/output/wc2"); // the results directory; it must not exist before the job runs
            if (fs.exists(outPath)) {
                fs.delete(outPath, true);
            }
            FileOutputFormat.setOutputPath(job, outPath);
            boolean f = job.waitForCompletion(true);
            if (f) {
                System.out.println("Task executed successfully!");
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
4) Export the jar package.
5) Upload the jar to a Linux directory via FTP.
6) Run the jar package
hadoop jar shuju.jar com.mp.RunJob / /outg
7) If both map and reduce reach 100%, check the job counters:
Shuffle Errors
BAD_ID=0
CONNECTION=0
IO_ERROR=0
WRONG_LENGTH=0
WRONG_MAP=0
WRONG_REDUCE=0
File Input Format Counters
Bytes Read=45
File Output Format Counters
Bytes Written=18
Indicates that the operation is successful!
8) View the results
hadoop fs -tail /outg/part-r-00000