How does Hadoop upload local files to a distributed file system as a stream

2025-02-23 Update From: SLTechnology News&Howtos


Shulou (Shulou.com) 06/01 Report --

This article introduces how Hadoop uploads local files to the distributed file system (HDFS) as a stream. Many readers have questions about this topic, so the editor has consulted various materials and put together a simple, practical walkthrough. I hope it helps answer the question; please follow along and study the example below.

The complete example is as follows:

```java
package org.apache.hadoop.examples.yao;

import java.io.File;
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.LocalFileSystem;
import org.apache.hadoop.fs.Path;

public class ReadLocalFile2Hadoop {

    public static void main(String[] args) throws IOException {
        readLocalFile2Hadoop("/home/yaokj/temp", "test");
    }

    /**
     * Upload every file in a local folder to a folder on the
     * distributed file system, streaming the bytes through a buffer.
     *
     * @param inputDir local folder
     * @param hdfsDir  target folder on HDFS
     * @throws IOException
     */
    public static void readLocalFile2Hadoop(String inputDir, String hdfsDir)
            throws IOException {
        Configuration cfg = new Configuration();
        // Location of the cluster configuration files
        cfg.addResource(new Path("/home/yaokj/hadoop-0.20.203.0/conf/hdfs-site.xml"));
        cfg.addResource(new Path("/home/yaokj/hadoop-0.20.203.0/conf/core-site.xml"));

        FileSystem fs = FileSystem.get(cfg);                 // HDFS
        LocalFileSystem localFS = FileSystem.getLocal(cfg);  // local file system

        fs.mkdirs(new Path(hdfsDir));
        FileStatus[] inputFiles = localFS.listStatus(new Path(inputDir));

        FSDataOutputStream out;
        FSDataInputStream in;
        for (int i = 0; i < inputFiles.length; i++) {
            System.out.println(inputFiles[i].getPath().getName());
            in = localFS.open(inputFiles[i].getPath());
            // Note: a "/" separator is needed between the target folder
            // and the file name, otherwise the two are fused into one name.
            out = fs.create(new Path(hdfsDir + "/" + inputFiles[i].getPath().getName()));
            byte[] buffer = new byte[256];
            int byteRead = 0;
            while ((byteRead = in.read(buffer)) > 0) {
                out.write(buffer, 0, byteRead);
            }
            out.close();
            in.close();
            // Delete the local file after the upload
            File file = new File(inputFiles[i].getPath().toString());
            // System.out.println(inputFiles[i].getPath().toString());
            System.out.println(file.delete());
        }
    }
}
```
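The heart of the upload is the buffered copy loop: read up to 256 bytes from the input stream, write exactly the number of bytes read to the output stream, and repeat until `read()` reports the end of the stream. The same pattern works with any `java.io` streams, so it can be tried without a Hadoop cluster. Below is a minimal, self-contained sketch using in-memory streams; the class name `StreamCopyDemo` and the `copy` helper are illustrative names introduced here, not part of the article's code.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class StreamCopyDemo {

    // Copy all bytes from in to out through a fixed-size buffer,
    // mirroring the loop used in readLocalFile2Hadoop above.
    public static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[256];
        long total = 0;
        int byteRead;
        while ((byteRead = in.read(buffer)) > 0) {
            out.write(buffer, 0, byteRead);
            total += byteRead;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // 1000 bytes is larger than one buffer, so the loop runs several times.
        byte[] data = new byte[1000];
        for (int i = 0; i < data.length; i++) {
            data[i] = (byte) (i % 128);
        }
        ByteArrayInputStream in = new ByteArrayInputStream(data);
        ByteArrayOutputStream out = new ByteArrayOutputStream();

        long copied = copy(in, out);
        System.out.println(copied);                                            // 1000
        System.out.println(java.util.Arrays.equals(data, out.toByteArray()));  // true
    }
}
```

The loop uses `byteRead` on every write so that a short final read (here 232 bytes after three full 256-byte reads) does not write stale buffer contents.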

At this point, the study of how Hadoop uploads local files to the distributed file system as a stream is complete. I hope it has resolved your doubts. Pairing theory with practice is the best way to learn, so go and try it! If you want to keep learning more related knowledge, please continue to follow this website; the editor will keep working to bring you more practical articles.

