Java API Operations on HDFS (Notes)

2025-03-13 Update From: SLTechnology News&Howtos


Shulou (Shulou.com) 06/03 Report

Note: invoke whichever of the methods below you need from a main method.

package com.hadoop.test;

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.FileUtil;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.DistributedFileSystem;
import org.apache.hadoop.hdfs.protocol.DatanodeInfo;

public class HDFSAPITest {

    /*
     * 1. Get the HDFS file system
     */
    // get the file system
    public static FileSystem getFileSystem() throws Exception {
        // read the configuration
        Configuration conf = new Configuration();
        // return the default file system; when the code runs inside a Hadoop
        // cluster, this call alone is enough to obtain it
        // FileSystem fs = FileSystem.get(conf);
        // address of the target file system
        URI uri = new URI("hdfs://master:9000");
        // return the specified file system; use this form when testing locally
        FileSystem fs = FileSystem.get(uri, conf);
        return fs;
    }

    /*
     * 2. Create and delete files / directories
     */
    // create a directory
    public static void mkdir() throws Exception {
        // get the file system
        FileSystem fs = getFileSystem();
        // create a directory
        fs.mkdirs(new Path("hdfs://master:9000/20191021/test"));
        // release resources
        fs.close();
    }

    // delete a file or directory
    public static void rmdir() throws Exception {
        // return the FileSystem object
        FileSystem fs = getFileSystem();
        // delete a file or directory (true = recursive)
        fs.delete(new Path("hdfs://master:9000/20191021/test"), true);
        // release resources
        fs.close();
    }

    /*
     * 3. List files
     */
    // list all files under a directory
    public static void ListAllFile() throws Exception {
        // return the FileSystem object
        FileSystem fs = getFileSystem();
        // list the directory contents
        FileStatus[] status = fs.listStatus(new Path("hdfs://master:9000/"));
        // get the paths of all files in the directory
        Path[] listedPaths = FileUtil.stat2Paths(status);
        // print each path in a loop
        for (Path p : listedPaths) {
            System.out.println(p);
        }
        // release resources
        fs.close();
    }

    /*
     * 4. Upload / download files
     */
    // upload a file to HDFS
    public static void copyToHDFS() throws Exception {
        // return the FileSystem object
        FileSystem fs = getFileSystem();
        // the source path is a Linux path; when testing under Windows,
        // rewrite it as a Windows path such as E://Hadoop/weibo.txt
        // Path srcPath = new Path("/home/hadoop/weibo.txt");
        Path srcPath = new Path("E://Hadoop/weibo.txt");
        // destination path
        Path dstPath = new Path("hdfs://master:9000/20191021/test");
        // upload the file
        fs.copyFromLocalFile(srcPath, dstPath);
        // release resources
        fs.close();
    }

    // download a file from HDFS
    public static void getFile() throws Exception {
        // return the FileSystem object
        FileSystem fs = getFileSystem();
        // source file path
        Path srcPath = new Path("hdfs://master:9000/20191021/test/test.txt");
        // the destination is a Linux path; when testing under Windows,
        // rewrite it as a Windows path such as E://hadoop/djt/
        // Path dstPath = new Path("/home/hadoop/");
        Path dstPath = new Path("E://hadoop/djt/");
        // download the file from HDFS
        fs.copyToLocalFile(srcPath, dstPath);
        // release resources
        fs.close();
    }

    /*
     * 5. Get HDFS cluster node information
     */
    // get HDFS cluster node information
    public static void getHDFSNodes() throws Exception {
        // return the FileSystem object
        FileSystem fs = getFileSystem();
        // cast to the distributed file system
        DistributedFileSystem hdfs = (DistributedFileSystem) fs;
        // get all data nodes
        DatanodeInfo[] dataNodeStats = hdfs.getDataNodeStats();
        // print every node in a loop (the original article is cut off here;
        // this loop is reconstructed from the surrounding comments)
        for (int i = 0; i < dataNodeStats.length; i++) {
            System.out.println("DataNode_" + i + "_Name: " + dataNodeStats[i].getHostName());
        }
        // release resources
        fs.close();
    }
}
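The note at the top says to invoke the desired operation from a main method, but the article never shows one. A hypothetical driver could look like the sketch below; paste it into HDFSAPITest (it assumes a reachable cluster at hdfs://master:9000, so it will only run against a live cluster):

```java
// hypothetical main method for HDFSAPITest: uncomment the single
// operation you want to run, then recompile and execute the class
public static void main(String[] args) throws Exception {
    mkdir();
    // rmdir();
    // ListAllFile();
    // copyToHDFS();
    // getFile();
    // getHDFSNodes();
}
```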
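The imports above come from the Hadoop client libraries, so the class only compiles with those jars on the classpath. If the project is built with Maven, a dependency along these lines pulls them in (the version shown is an assumption; match it to your cluster's Hadoop version):

```xml
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <!-- assumption: pick the version that matches your cluster -->
  <version>2.7.7</version>
</dependency>
```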
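getFileSystem builds the target address with java.net.URI. As a quick standalone sanity check (no cluster needed), the NameNode URI used throughout these examples decomposes like this:

```java
import java.net.URI;

public class UriCheck {
    public static void main(String[] args) {
        // the NameNode address used in the examples above
        URI uri = URI.create("hdfs://master:9000");
        System.out.println(uri.getScheme()); // prints "hdfs"
        System.out.println(uri.getHost());   // prints "master"
        System.out.println(uri.getPort());   // prints "9000"
    }
}
```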
