For basic file system command operations, you can get detailed help for every command through hadoop fs -help.
The Java abstract class org.apache.hadoop.fs.FileSystem defines the file system interface for Hadoop. Hadoop's file-manipulation classes live almost entirely in the org.apache.hadoop.fs package, and the operations this API supports include opening, reading, writing, and deleting files, among others.
The final interface class that the Hadoop class library provides to users is FileSystem. It is an abstract class, so an instance can only be obtained through the class's static get methods.
Obtaining an instance
Because FileSystem is abstract, you obtain an instance through one of the following two static factory methods:
public static FileSystem get(Configuration conf) throws IOException
public static FileSystem get(URI uri, Configuration conf) throws IOException
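As a minimal sketch of both factory methods (the namenode URI below is a placeholder, not a value from this article):

import java.io.IOException;
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class GetFileSystemExample {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();

        // Obtain the default file system named by the configuration
        FileSystem fs = FileSystem.get(conf);

        // Or target a specific file system by URI (placeholder address)
        FileSystem hdfs = FileSystem.get(URI.create("hdfs://namenode:9000"), conf);

        System.out.println(fs.getUri());
        System.out.println(hdfs.getUri());
    }
}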
Concrete methods
1. public boolean mkdirs(Path f) throws IOException
Creates all directories on the path at once, including any missing parent directories; f is the complete directory path.
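A minimal sketch, assuming a FileSystem handle named hdfs obtained as shown above and the usual org.apache.hadoop.fs imports:

// Creates the directory and any missing parents, similar to `mkdir -p`;
// returns true if the directory now exists
public boolean makeDir(FileSystem hdfs, String dir) throws IOException {
    return hdfs.mkdirs(new Path(dir));
}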
2. public FSDataOutputStream create(Path f) throws IOException
Creates a file at the specified path and returns an output stream for writing data.
create() has multiple overloaded versions that let you specify whether to force overwriting of an existing file, the replication factor (number of file backups), the write buffer size, the block size, and the file permissions.
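For illustration, one overloaded form sets several of these options explicitly (the values below are examples, not recommendations):

// create(Path, overwrite, bufferSize, replication, blockSize)
public FSDataOutputStream createWithOptions(FileSystem hdfs, String file) throws IOException {
    return hdfs.create(new Path(file),
            true,                 // overwrite the file if it already exists
            4096,                 // write buffer size in bytes
            (short) 3,            // replication factor (number of backups)
            128L * 1024 * 1024);  // block size: 128 MB
}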
3. public void copyFromLocalFile(Path src, Path dst) throws IOException
Copies a local file to the file system.
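A one-call sketch with hypothetical paths:

public void upload(FileSystem hdfs) throws IOException {
    // copies /tmp/local.txt from the local disk into HDFS; the local source file is kept
    hdfs.copyFromLocalFile(new Path("/tmp/local.txt"), new Path("/user/test/local.txt"));
}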
4. public boolean exists(Path f) throws IOException
Checks whether a file or directory exists.
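A minimal check, again assuming the hdfs handle from above:

public void checkExists(FileSystem hdfs, String name) throws IOException {
    if (hdfs.exists(new Path(name))) {
        System.out.println(name + " exists");
    } else {
        System.out.println(name + " not found");
    }
}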
5. public boolean delete(Path f, boolean recursive) throws IOException
Permanently deletes the specified file or directory. If f is a file or an empty directory, the value of recursive is ignored; a non-empty directory and its contents are deleted only when recursive = true.
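A sketch of a recursive delete (use with care; this removes the directory and everything under it):

public boolean remove(FileSystem hdfs, String name) throws IOException {
    // recursive = true is required when name refers to a non-empty directory
    return hdfs.delete(new Path(name), true);
}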
6. The FileStatus class encapsulates the metadata of files and directories in the file system, including file length, block size, replication, modification time, owner, and permission information.
The FileStatus objects returned by FileSystem.listStatus() can be used, via FileStatus.getPath(), to list all files in a specified HDFS directory.
The following example class demonstrates these operations:

package hdfsTest;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class OperatingFiles {
    // initialization: load the cluster configuration files and open the file system
    static Configuration conf = new Configuration();
    static FileSystem hdfs;
    static {
        String path = "/usr/java/hadoop-1.0.3/conf/";
        conf.addResource(new Path(path + "core-site.xml"));
        conf.addResource(new Path(path + "hdfs-site.xml"));
        conf.addResource(new Path(path + "mapred-site.xml"));
        path = "/usr/java/hbase-0.90.3/conf/";
        conf.addResource(new Path(path + "hbase-site.xml"));
        try {
            hdfs = FileSystem.get(conf);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    // create a directory
    public void createDir(String dir) throws IOException {
        Path path = new Path(dir);
        hdfs.mkdirs(path);
        System.out.println("new dir\t" + conf.get("fs.default.name") + dir);
    }

    // copy a local file to HDFS
    public void copyFile(String localSrc, String hdfsDst) throws IOException {
        Path src = new Path(localSrc);
        Path dst = new Path(hdfsDst);
        hdfs.copyFromLocalFile(src, dst);

        // list all the files in the target directory
        FileStatus[] files = hdfs.listStatus(dst);
        System.out.println("Upload to\t" + conf.get("fs.default.name") + hdfsDst);
        for (FileStatus file : files) {
            System.out.println(file.getPath());
        }
    }

    // create a new file and write content to it
    public void createFile(String fileName, String fileContent) throws IOException {
        Path dst = new Path(fileName);
        byte[] bytes = fileContent.getBytes();
        FSDataOutputStream output = hdfs.create(dst);
        output.write(bytes);
        output.close(); // close the stream so the data is flushed to HDFS
        System.out.println("new file\t" + conf.get("fs.default.name") + fileName);
    }

    // list all files under a directory
    public void listFiles(String dirName) throws IOException {
        Path f = new Path(dirName);
        FileStatus[] status = hdfs.listStatus(f);
        System.out.println(dirName + " has all files:");
        // the original listing was cut off here; a minimal completion that prints each path
        for (int i = 0; i < status.length; i++) {
            System.out.println(status[i].getPath().toString());
        }
    }
}
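As a usage sketch, a hypothetical driver for the class above (all paths are examples only, not from the original article):

package hdfsTest;

import java.io.IOException;

public class OperatingFilesDemo {
    public static void main(String[] args) throws IOException {
        OperatingFiles ops = new OperatingFiles();
        ops.createDir("/user/test/input");                           // hypothetical HDFS directory
        ops.copyFile("/tmp/local.txt", "/user/test/input");          // hypothetical local file
        ops.createFile("/user/test/input/hello.txt", "hello hdfs");  // write a small text file
        ops.listFiles("/user/test/input");                           // print the directory contents
    }
}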