This article walks through the use of Hadoop's command line in detail; interested readers can refer to it, and I hope it helps you.
Hadoop commands
All Hadoop commands are invoked by the bin/hadoop script. Running the hadoop script without any arguments prints the description for all commands.
Usage: hadoop [--config confdir] [COMMAND] [GENERIC_OPTIONS] [COMMAND_OPTIONS]
Hadoop has an option-parsing framework that handles the generic options as well as the classes to run.
Command options:
--config confdir   Overrides the default configuration directory. The default is ${HADOOP_HOME}/conf.
GENERIC_OPTIONS   The common set of options supported by multiple commands.
COMMAND COMMAND_OPTIONS   The various commands and their options are described in the sections below. The commands are divided into two groups: user commands and administration commands.
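For example, to point the hadoop script at a non-default configuration directory while listing a directory (the /opt/hadoop/conf path and the HDFS path here are only illustrative, not from the original guide):
hadoop --config /opt/hadoop/conf fs -ls /user/hadoop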
Hadoop Command General Options
The following options are supported by dfsadmin, fs, fsck and job. Applications should implement the Tool interface to support the generic options.
Generic options:
-conf <configuration file>   Specifies an application configuration file.
-D <property=value>   Uses the given value for the given property.
-fs <local|namenode:port>   Specifies a namenode.
-jt <local|jobtracker:port>   Specifies a jobtracker. Applies only to job.
-files <comma-separated list of files>   Specifies files to be copied to the MapReduce cluster. Applies only to job.
-libjars <comma-separated list of jars>   Specifies jar files to include in the classpath. Applies only to job.
-archives <comma-separated list of archives>   Specifies archives to be unarchived on the compute machines. Applies only to job.
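As a rough sketch of how the generic options combine with a command, the following invocation uses assumed values (the namenode address, the property name and the path are illustrative only):
hadoop fs -fs hdfs://namenode:8020 -D fs.trash.interval=1440 -ls /user/hadoop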
User commands
Common commands for hadoop cluster users.
archive
Creates a Hadoop archive file. See Hadoop Archives for more information.
Usage: hadoop archive -archiveName NAME <src>* <dest>
Command options:
-archiveName NAME   The name of the archive to be created.
src   Filesystem pathnames, which work as usual with regular expressions.
dest   The destination directory that will contain the archive.
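A minimal illustration, assuming a directory /user/hadoop/logs to be archived and a destination directory /user/hadoop/archives (both paths are hypothetical):
hadoop archive -archiveName logs.har /user/hadoop/logs /user/hadoop/archives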
distcp
The distcp command recursively copies files or directories. Refer to the DistCp Guide for more information.
Usage: hadoop distcp <srcurl> <desturl>
Command options:
srcurl   Source URL.
desturl   Destination URL.
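For instance, copying a directory between two clusters might look like the following (the namenode hostnames, port and paths are assumptions for illustration):
hadoop distcp hdfs://namenode1:8020/user/hadoop/input hdfs://namenode2:8020/user/hadoop/input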
fs
Usage: hadoop fs [GENERIC_OPTIONS] [COMMAND_OPTIONS]
Runs a generic filesystem user client.
The various command options can be found in the HDFS Shell Guide.
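Two typical invocations, with hypothetical paths, might look like this:
hadoop fs -ls /user/hadoop
hadoop fs -put localfile.txt /user/hadoop/localfile.txt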
fsck
This command runs the HDFS filesystem checking utility. See Fsck for more information.
Usage: hadoop fsck [GENERIC_OPTIONS] <path> [-move | -delete | -openforwrite] [-files [-blocks [-locations | -racks]]]
Command options:
<path>   The starting path for the check.
-move   Moves corrupted files to /lost+found.
-delete   Deletes corrupted files.
-openforwrite   Prints out files that are open for write.
-files   Prints out the files being checked.
-blocks   Prints out the block report.
-locations   Prints out the location of every block.
-racks   Prints out the network topology of the data nodes.
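For example, checking a hypothetical user directory and printing file, block and location details might look like this:
hadoop fsck /user/hadoop -files -blocks -locations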
jar
This command runs a jar file. Users can bundle their MapReduce code into a jar file and execute it with this command.
Usage: hadoop jar <jar> [mainClass] args...
Streaming jobs are run via this command. See the Streaming examples for more information.
The WordCount example is also run using the jar command. See the WordCount example for more information.
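A sketch of running the bundled WordCount example; the jar file name depends on your Hadoop version, and the input and output paths are hypothetical:
hadoop jar hadoop-examples.jar wordcount /user/hadoop/input /user/hadoop/output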
job
Used to interact with MapReduce jobs.
Usage: hadoop job [GENERIC_OPTIONS] [-submit <job-file>] | [-status <job-id>] | [-counter <job-id> <group-name> <counter-name>] | [-kill <job-id>] | [-events <job-id> <from-event-#> <#-of-events>] | [-history [all] <jobOutputDir>] | [-list [all]] | [-kill-task <task-id>] | [-fail-task <task-id>]
Command options:
-submit <job-file>   Submits the job.
-status <job-id>   Prints the map and reduce completion percentages and all job counters.
-counter <job-id> <group-name> <counter-name>   Prints the value of the counter.
-kill <job-id>   Kills the specified job.
-events <job-id> <from-event-#> <#-of-events>   Prints the details of events received by the jobtracker in the given range.
-history [all] <jobOutputDir>   Prints job details, including failed and killed tasks and why they failed or were killed. More details about a job, such as successful tasks and the attempts made for each task, can be viewed by specifying the [all] option.
-list [all]   -list all displays all jobs; -list displays only jobs that have yet to complete.
-kill-task <task-id>   Kills the task. Killed tasks are not counted against failed attempts.
-fail-task <task-id>   Fails the task. Failed tasks are counted against failed attempts.
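A few illustrative invocations; the job ID below is made up and would be replaced by an ID returned by -list or by the job submission output:
hadoop job -list
hadoop job -status job_201503101025_0001
hadoop job -kill job_201503101025_0001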
That is all for this analysis of the use of Hadoop commands. I hope the above content is of some help to you. If you found the article useful, please share it so that more people can see it.