Foreword:
The previous posts introduced the architecture and basic workflow of MapReduce and Yarn. This article uses the wordcount program as an example to briefly show how a job is run on Yarn.
1. Running the wordcount example

[root@hadoop000 ~]# su - hadoop
[hadoop@hadoop000 ~]$ jps
9201 SecondaryNameNode
9425 ResourceManager
13875 Jps
9540 NodeManager
8852 NameNode
8973 DataNode

# create the wordcount input directory
[hadoop@hadoop000 ~]$ hdfs dfs -mkdir -p /wordcount/input
[hadoop@hadoop000 ~]$ vi test.log
# test.log contents:
jepson ruoze
hero yimi xjp
123
a b a
[hadoop@hadoop000 ~]$ hdfs dfs -put test.log /wordcount/input
[hadoop@hadoop000 ~]$ hdfs dfs -ls /wordcount/input
Found 1 items
-rw-r--r--   1 hadoop supergroup         37 2018-05-29 20:38 /wordcount/input/test.log

# run the wordcount example jar
[hadoop@hadoop000 ~]$ yarn jar \
> /opt/software/hadoop-2.8.1/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.8.1.jar \
> wordcount \
> /wordcount/input \
> /wordcount/output
18/05/29 20:40:59 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
18/05/29 20:40:59 INFO input.FileInputFormat: Total input files to process : 1
18/05/29 20:41:00 INFO mapreduce.JobSubmitter: number of splits:1
18/05/29 20:41:00 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1526991305992_0001
18/05/29 20:41:01 INFO impl.YarnClientImpl: Submitted application application_1526991305992_0001
18/05/29 20:41:01 INFO mapreduce.Job: The url to track the job: http://hadoop000:8088/proxy/application_1526991305992_0001/
18/05/29 20:41:01 INFO mapreduce.Job: Running job: job_1526991305992_0001
18/05/29 20:41:14 INFO mapreduce.Job: Job job_1526991305992_0001 running in uber mode : false
18/05/29 20:41:14 INFO mapreduce.Job:  map 0% reduce 0%
18/05/29 20:41:23 INFO mapreduce.Job:  map 100% reduce 0%
18/05/29 20:41:29 INFO mapreduce.Job:  map 100% reduce 100%
18/05/29 20:41:30 INFO mapreduce.Job: Job job_1526991305992_0001 completed successfully
18/05/29 20:41:30 INFO mapreduce.Job: Counters: 49

# view the results
[hadoop@hadoop000 ~]$ hdfs dfs -ls /wordcount/output
Found 2 items
-rw-r--r--   1 hadoop supergroup          0 2018-05-29 20:41 /wordcount/output/_SUCCESS
-rw-r--r--   1 hadoop supergroup         51 2018-05-29 20:41 /wordcount/output/part-r-00000
[hadoop@hadoop000 ~]$ hdfs dfs -cat /wordcount/output/part-r-00000
123     1
a       2
b       1
hero    1
jepson  1
ruoze   1
xjp     1
yimi    1
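Note that MapReduce will not overwrite an existing output directory: re-running the job against the same /wordcount/output path fails with a FileAlreadyExistsException. A minimal sketch of a re-run, reusing the paths from the transcript above:

# remove the previous output directory, otherwise FileOutputFormat aborts
[hadoop@hadoop000 ~]$ hdfs dfs -rm -r /wordcount/output
# submit the job again with the same input and output paths
[hadoop@hadoop000 ~]$ yarn jar /opt/software/hadoop-2.8.1/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.8.1.jar \
> wordcount /wordcount/input /wordcount/output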
Log on to the ResourceManager web UI to view the job's details: http://192.168.6.217:8088/cluster
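The same information is also available from the command line. Besides the commands summarized in the next section, a status report and the container logs can be fetched directly; a short sketch, using the application id from the run above (yarn logs only returns output once log aggregation, yarn.log-aggregation-enable=true, is turned on):

# print the status report of one application
[hadoop@hadoop000 ~]$ yarn application -status application_1526991305992_0001
# fetch the aggregated container logs for the application
[hadoop@hadoop000 ~]$ yarn logs -applicationId application_1526991305992_0001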
2. Summary of common Yarn commands

yarn jar <jar file> [args]                              -- run a jar file
yarn application -list                                  -- list the running jobs
yarn application -kill application_1526991305992_0001   -- kill a running job (the argument is the application id)

3. wordcount process
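Conceptually the job runs in three phases: each map task splits its input into (word, 1) pairs, the shuffle sorts and groups the pairs by key, and each reduce task sums the values for one key. The same data flow can be imitated with ordinary shell tools; this is only an analogy for the phases, not how Hadoop actually executes them:

# "map": split the file into one word per line
# "shuffle": sort brings identical words together
# "reduce": uniq -c counts each group
[hadoop@hadoop000 ~]$ tr -s ' ' '\n' < test.log | sort | uniq -c
      1 123
      2 a
      1 b
      1 hero
      1 jepson
      1 ruoze
      1 xjp
      1 yimi

The counts match the part-r-00000 output shown above.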
Reference: https://blog.csdn.net/yczws1/article/details/21794873