2025-04-11 Update From: SLTechnology News & Howtos
Shulou (Shulou.com) 06/01 Report
This article describes how to upload data to and download data from HDFS in Java using IO streams. It is intended as a practical, follow-along reference.
IO Operation - File Upload
1. Upload the hello.txt file from the local D: drive to the HDFS root directory.
2. The code is as follows:
@Test
public void putFileToHDFS() throws IOException, InterruptedException, URISyntaxException {
    // 1 Get the file system
    Configuration configuration = new Configuration();
    FileSystem fs = FileSystem.get(new URI("hdfs://hadoop100:9000"), configuration, "root");
    // 2 Create the input stream over the local file
    FileInputStream fis = new FileInputStream(new File("d:/hello.txt"));
    // 3 Get the output stream to the HDFS target path
    FSDataOutputStream fos = fs.create(new Path("/hello.txt"));
    // 4 Copy the stream
    IOUtils.copyBytes(fis, fos, configuration);
    // 5 Close resources
    IOUtils.closeStream(fos);
    IOUtils.closeStream(fis);
    fs.close();
}
3. Open http://hadoop100:50070/explorer.html#/ in a browser to verify that the file was uploaded.
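The IOUtils.copyBytes call in step 4 hides a simple buffered copy loop. As a rough illustration of what such a copy does, here is a minimal sketch using only java.io; the CopySketch class name and the 4096-byte buffer size are choices made for this example, not anything defined by Hadoop.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class CopySketch {
    // Copy every byte from in to out through a fixed-size buffer,
    // the same read-then-write pattern a stream copy utility uses.
    public static long copyBytes(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[4096];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) { // read() returns -1 at end of stream
            out.write(buf, 0, n);          // write only the bytes actually read
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = "hello hdfs".getBytes();
        ByteArrayInputStream in = new ByteArrayInputStream(data);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        long copied = copyBytes(in, out);
        System.out.println(copied + " bytes copied");
    }
}
```

The key detail is writing only `n` bytes per iteration: the final read usually fills only part of the buffer.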
IO Operation - File Download
1. Download the hello.txt file from HDFS to the root of the local D: drive.
2. The code is as follows:
@Test
public void getFileFromHDFS() throws IOException, InterruptedException, URISyntaxException {
    // 1 Get the file system
    Configuration configuration = new Configuration();
    FileSystem fs = FileSystem.get(new URI("hdfs://hadoop100:9000"), configuration, "root");
    // 2 Get the input stream from HDFS
    FSDataInputStream fis = fs.open(new Path("/hello.txt"));
    // 3 Get the output stream to the local file
    FileOutputStream fos = new FileOutputStream(new File("d:/hello.txt"));
    // 4 Copy the stream
    IOUtils.copyBytes(fis, fos, configuration);
    // 5 Close resources
    IOUtils.closeStream(fos);
    IOUtils.closeStream(fis);
    fs.close();
}
3. Check the local directory to confirm the file was downloaded.
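Both tests above close their streams by hand in step 5; if the copy throws first, those close calls never run. A try-with-resources block closes streams automatically even on an exception. Here is a minimal local sketch of the pattern, with plain java.io streams standing in for the HDFS ones (the TryWithResourcesSketch class and roundTrip method are invented for this illustration):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class TryWithResourcesSketch {
    // Streams declared in the try header are closed automatically,
    // in reverse order, when the block exits, normally or by exception.
    public static byte[] roundTrip(byte[] data) throws IOException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        try (InputStream in = new ByteArrayInputStream(data);
             OutputStream out = sink) {
            in.transferTo(out); // JDK 9+: copies the whole input stream to out
        }
        return sink.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] copy = roundTrip("hello".getBytes());
        System.out.println(copy.length + " bytes");
    }
}
```

The same header syntax works with FSDataInputStream and FileOutputStream, since both implement java.io.Closeable.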
IO Operation - Block Read
1. Read a large file on HDFS block by block, for example /hadoop-2.7.2.tar.gz in the root directory.
2. Download the first block with the following code
@Test
public void readFileSeek1() throws IOException, InterruptedException, URISyntaxException {
    // 1 Get the file system
    Configuration configuration = new Configuration();
    FileSystem fs = FileSystem.get(new URI("hdfs://hadoop100:9000"), configuration, "root");
    // 2 Get the input stream
    FSDataInputStream fis = fs.open(new Path("/hadoop-2.7.2.tar.gz"));
    // 3 Create the output stream
    FileOutputStream fos = new FileOutputStream(new File("d:/hadoop-2.7.2.tar.gz.part1"));
    // 4 Copy the stream 1 KB at a time, 128 * 1024 times = 128 MB (one HDFS block);
    //   this assumes each read() fills the buffer completely
    byte[] buf = new byte[1024];
    for (int i = 0; i < 1024 * 128; i++) {
        fis.read(buf);
        fos.write(buf);
    }
    // 5 Close resources
    IOUtils.closeStream(fis);
    IOUtils.closeStream(fos);
    fs.close();
}

3. Download the second block with the following code:

@Test
public void readFileSeek2() throws IOException, InterruptedException, URISyntaxException {
    // 1 Get the file system
    Configuration configuration = new Configuration();
    FileSystem fs = FileSystem.get(new URI("hdfs://hadoop100:9000"), configuration, "root");
    // 2 Open the input stream
    FSDataInputStream fis = fs.open(new Path("/hadoop-2.7.2.tar.gz"));
    // 3 Seek past the first block (128 MB)
    fis.seek(1024 * 1024 * 128);
    // 4 Create the output stream
    FileOutputStream fos = new FileOutputStream(new File("d:/hadoop-2.7.2.tar.gz.part2"));
    // 5 Copy the rest of the stream
    IOUtils.copyBytes(fis, fos, configuration);
    // 6 Close resources
    IOUtils.closeStream(fis);
    IOUtils.closeStream(fos);
    fs.close();
}

4. Merge the two parts into one file. In a Windows command window, change to the directory where the parts were saved and run the following command to append the data:

type hadoop-2.7.2.tar.gz.part2 >> hadoop-2.7.2.tar.gz.part1

After the merge completes, rename hadoop-2.7.2.tar.gz.part1 to hadoop-2.7.2.tar.gz. Unpacking it shows that the tar archive is complete.
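The two tests above amount to "seek to an offset, then copy a bounded range", and the merge step simply concatenates the parts. The same idea can be tried locally with plain java.io, using InputStream.skip in place of FSDataInputStream.seek; the BlockSplitSketch class, the readRange helper, and the tiny 4-byte "block" size are all made up for this illustration.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class BlockSplitSketch {
    // Copy exactly 'count' bytes starting at 'offset': the local
    // analogue of seek() followed by a bounded copy.
    public static byte[] readRange(byte[] src, long offset, int count) throws IOException {
        try (InputStream in = new ByteArrayInputStream(src)) {
            long skipped = in.skip(offset); // like fis.seek(offset) on HDFS
            if (skipped != offset) throw new IOException("could not skip to offset");
            return in.readNBytes(count);    // JDK 11+: reads up to 'count' bytes
        }
    }

    public static void main(String[] args) throws IOException {
        byte[] whole = "0123456789".getBytes();
        // "block 1" = first 4 bytes, "block 2" = the remainder
        byte[] part1 = readRange(whole, 0, 4);
        byte[] part2 = readRange(whole, 4, whole.length - 4);
        // Re-merging part2 onto part1 is what the 'type ... >>' command does
        ByteArrayOutputStream merged = new ByteArrayOutputStream();
        merged.write(part1);
        merged.write(part2);
        System.out.println(new String(merged.toByteArray()));
    }
}
```

Because the split is at a plain byte offset, concatenating the parts in order reproduces the original file exactly, which is why the reassembled tar archive unpacks cleanly.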
Thank you for reading! That concludes this article on uploading and downloading HDFS data with Java IO streams. I hope it has been helpful; if you found it useful, please share it with others.