2025-02-23 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)06/01 Report--
This article introduces the basic, commonly used HDFS file operations through the Java API, for reference by interested readers. I hope you find it useful.
1. Create an HDFS file
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class Test4CreateFile {
    /* Create an HDFS file and write some bytes to it. */
    public static void main(String[] args) {
        try {
            Configuration conf = new Configuration();
            URI uri = new URI("hdfs://192.168.226.129:9000");
            byte[] buff = "Hello Hadoop HDFS".getBytes();
            FileSystem fs = FileSystem.get(uri, conf);
            Path dfs = new Path("hdfs://192.168.226.129:9000/studyhadoop");
            FSDataOutputStream outputStream = fs.create(dfs);
            outputStream.write(buff, 0, buff.length);
            outputStream.close(); // flush and close the stream
            FileStatus[] files = fs.listStatus(dfs);
            for (FileStatus file : files) {
                System.out.println("file: " + file.getPath());
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
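For comparison, the same operation is available from the HDFS shell. This is a sketch, assuming the cluster above is configured as the default filesystem; the paths mirror the Java example:

```shell
# Create an empty file at the path used in the Java example
hdfs dfs -touchz /studyhadoop

# Or write content from stdin ("-" tells -put to read standard input)
echo "Hello Hadoop HDFS" | hdfs dfs -put - /studyhadoop
```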
2. Delete an HDFS file
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class Test5DeleteFile {
    /* Delete an HDFS file. */
    public static void main(String[] args) {
        try {
            Configuration conf = new Configuration();
            URI uri = new URI("hdfs://192.168.226.129:9000");
            FileSystem fs = FileSystem.get(uri, conf);
            Path delef = new Path("hdfs://192.168.226.129:9000/testhadoop1");
            // The second argument controls recursive deletion; false is
            // sufficient for a single file.
            boolean isDeleted = fs.delete(delef, false);
            System.out.println("isDeleted: " + isDeleted);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
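The shell equivalent is a sketch along these lines, assuming the same cluster and path:

```shell
# Delete a single file; matches fs.delete(delef, false) in the Java example.
# Add -r to delete a directory recursively.
hdfs dfs -rm /testhadoop1
```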
3. Create an HDFS directory
import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class Test9Mkdir {
    /* Create a directory in HDFS. */
    public static void main(String[] args) {
        try {
            Configuration conf = new Configuration();
            URI uri = new URI("hdfs://192.168.226.129:9000");
            FileSystem fs = FileSystem.get(uri, conf);
            Path dfs = new Path("hdfs://192.168.226.129:9000/testhadoop");
            boolean isMkdirs = fs.mkdirs(dfs);
            if (isMkdirs) {
                System.out.println("Make Dir Successful!");
            } else {
                System.out.println("Make Dir Failure!");
            }
            fs.close();
        } catch (IllegalArgumentException e) {
            e.printStackTrace();
        } catch (URISyntaxException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
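From the shell, directory creation looks roughly like this (the `/parent/child` path is purely illustrative):

```shell
hdfs dfs -mkdir /testhadoop

# -p also creates missing parent directories, similar to fs.mkdirs()
hdfs dfs -mkdir -p /parent/child
```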
4. Rename an HDFS file
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class Test2Rename {
    /* Rename an HDFS file. */
    public static void main(String[] args) {
        try {
            Configuration conf = new Configuration();
            URI uri = new URI("hdfs://192.168.226.129:9000");
            FileSystem fs = FileSystem.get(uri, conf);
            Path oldpath = new Path("hdfs://192.168.226.129:9000/testhadoop");
            Path newpath = new Path("hdfs://192.168.226.129:9000/testhadoop1");
            // Check whether the file exists
            boolean isExists = fs.exists(oldpath);
            System.out.println("isExists: " + isExists);
            // Rename the file
            fs.rename(oldpath, newpath);
            isExists = fs.exists(newpath);
            System.out.println("newpath isExists: " + isExists);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
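A hedged shell equivalent, assuming the same paths as the Java example (`-mv` performs a rename when source and destination are on the same filesystem):

```shell
hdfs dfs -mv /testhadoop /testhadoop1
```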
5. Upload a local file to HDFS
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class Test3CopyFile {
    /* Upload a local file to HDFS. */
    public static void main(String[] args) {
        try {
            Configuration conf = new Configuration();
            URI uri = new URI("hdfs://192.168.226.129:9000");
            FileSystem fs = FileSystem.get(uri, conf);
            Path src = new Path("F:\\04-HadoopStudy\\mapreduce.txt");
            Path dst = new Path("hdfs://192.168.226.129:9000/rootdir");
            fs.copyFromLocalFile(src, dst);
            System.out.println("Upload to " + conf.get("fs.default.name"));
            FileStatus[] files = fs.listStatus(dst);
            for (FileStatus file : files) {
                System.out.println(file.getPath());
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
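The shell offers the same upload, and its counterpart for downloading (used in section 6 below). A sketch, with `/local/path` standing in for a local directory of your choosing (the Java example uses a Windows path):

```shell
# Upload: local -> HDFS, like fs.copyFromLocalFile()
hdfs dfs -put /local/path/mapreduce.txt /rootdir

# Download: HDFS -> local, like fs.copyToLocalFile()
hdfs dfs -get /studyhadoop /local/path/
```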
6. Download a file from HDFS to the local file system
import java.io.FileNotFoundException;
import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class Test10CopyToFile {
    /* Download a file from HDFS to the local file system. */
    public static void main(String[] args) {
        try {
            Configuration conf = new Configuration();
            URI uri = new URI("hdfs://192.168.226.129:9000");
            FileSystem fs = FileSystem.get(uri, conf);
            Path src = new Path("F:\\");
            Path dst = new Path("hdfs://192.168.226.129:9000/studyhadoop");
            // copyToLocalFile(src, dst) copies from HDFS to local,
            // so the HDFS path comes first.
            fs.copyToLocalFile(dst, src);
            System.out.println("Download from " + conf.get("fs.default.name"));
            FileStatus[] files = fs.listStatus(dst);
            for (FileStatus file : files) {
                System.out.println(file.getPath());
            }
        } catch (IllegalArgumentException e) {
            e.printStackTrace();
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (URISyntaxException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

Thank you for reading. I hope this overview of the basic, commonly used HDFS operations has been helpful.