What are the common commands for a Hadoop cluster under Linux? Hadoop is a distributed system infrastructure developed by the Apache Foundation. Users can develop distributed programs without knowing the low-level details of the distribution, making full use of the cluster's power for high-speed computation and storage.
Upload files
1) hadoop fs -put words.txt /path/to/input/
2) hdfs dfs -put words.txt /path/wc/input/
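The -put command also accepts several local sources in one call. A minimal sketch, with illustrative file names:
hadoop fs -put words1.txt words2.txt /path/wc/input/   # copy several local files into the same HDFS directory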
View the files contained in a directory
1) hadoop fs -ls /path/wc/input/
2) hadoop fs -ls hdfs://node1:9000/path/wc/input/
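To walk an entire directory tree rather than a single level, the fs shell also supports a recursive flag. A minimal sketch, reusing the path from the examples above:
hadoop fs -ls -R /path/wc/   # list all files and subdirectories recursively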
Change the permissions and ownership of a file, as in the Linux file system
1) hadoop fs -chmod 666 /hello.txt
2) hadoop fs -chown someuser:somegrp /hello.txt
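Both commands also take a recursive flag when a whole directory needs the same permissions. A minimal sketch, assuming the /path/wc/ directory from the earlier examples:
hadoop fs -chmod -R 755 /path/wc/   # apply the mode to the directory and everything under it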
du: file size statistics
hadoop fs -du -h /      # size of each file or directory under the folder
hadoop fs -du -s -h /   # total size of the folder; the output is the folder size and the total size including replicas
hadoop fs -count /      # counts files; the output is the number of directories, the number of files, the total file size, and the input path
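When looking for the largest consumers of space, the per-entry output of -du can be piped through the ordinary Linux sort. A minimal sketch, assuming a standard sort is available on the client machine:
hadoop fs -du / | sort -n -k1 | tail -n 10   # ten largest entries directly under /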
Kill a YARN task
yarn application -kill application_id
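The application_id can be looked up first with the YARN CLI. A minimal sketch, filtering to applications that are still running:
yarn application -list -appStates RUNNING   # prints the IDs, names, and states of running applications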
View YARN logs
yarn logs -applicationId application_id > logs.txt
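The log dump can also be narrowed to a single container. A minimal sketch, assuming the container ID has been read from the ResourceManager UI; depending on the Hadoop version, the node address may also be required via -nodeAddress:
yarn logs -applicationId application_id -containerId container_id > container.txt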