

How to resolve exceptions when uploading local files to the HDFS distributed file system through the HDFS API in a program written in Eclipse?

2025-04-11 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)05/31 Report--

This article explains in detail the exceptions you may run into when writing a program in Eclipse that uploads local files to the HDFS distributed file system through the HDFS API. The editor finds it quite practical and shares it here for reference; I hope you gain something after reading it.

1. Problems encountered when uploading local files to HDFS:

July 03, 2014 4:44:36 PM org.apache.hadoop.ipc.Client$Connection handleConnectionFailure

INFO: Retrying connect to server: master/192.168.232.134:8020. Already tried 0 time(s).

or

Call to master/192.168.232.134:8020 failed on connection exception: java.net.ConnectException: Connection refused: no further information

Solution:

Problem: Wrong path

At first, I had written the path as:

String dst = "hdfs://master/opt/file06.txt";

or

String dst = "hdfs://192.168.232.134/opt/file06.txt";

or

String dst = "hdfs://master:9001/opt/file06.txt";


These are all wrong; the correct path is:

String dst = "hdfs://master:9000/opt/file06.txt";

Note:

In fact, the log line "INFO: Retrying connect to server: master/192.168.232.134:8020. Already tried 0 time(s)." shows port 8020 because the program does not explicitly specify port 9000. When the URI omits the port, the NameNode RPC port given by the fs.default.name configuration item defaults to 8020. However, our cluster configures fs.default.name with port 9000 (in the hdfs-site.xml file in this setup), so the program's connection attempt to port 8020 fails. The correct spelling is "hdfs://192.168.232.134:9000/opt/file06.txt" or "hdfs://master:9000/opt/file06.txt".
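For reference, the configuration entry the note describes might look like the fragment below. This is a hypothetical sketch, not copied from the cluster in question; in stock Hadoop deployments, fs.default.name is usually set in core-site.xml rather than hdfs-site.xml.

```xml
<!-- Hypothetical fragment: fs.default.name pins the NameNode RPC endpoint -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://master:9000</value>
</property>
```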

1) If the hostname master is mapped in the hosts file on the Windows machine, we can use master directly; here master is equivalent to 192.168.232.134.

2) It is important to write the port correctly; here the port should be 9000.

So both String dst = "hdfs://master:9000/opt/file06.txt"; and String dst = "hdfs://192.168.232.134:9000/opt/file06.txt"; are correct,

while String dst = "hdfs://master:9001/opt/file06.txt"; and String dst = "hdfs://192.168.232.134:9001/opt/file06.txt"; are both incorrect.
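One way to see why the port-less URIs above fall back to 8020: java.net.URI reports -1 when no port is present, and Hadoop then substitutes its default RPC port. The following is a minimal standalone check (no Hadoop dependency needed); the class name HdfsUriCheck is just for illustration:

```java
import java.net.URI;

public class HdfsUriCheck {
    public static void main(String[] args) {
        // Port omitted: getPort() is -1, so Hadoop falls back to its default RPC port (8020)
        URI noPort = URI.create("hdfs://master/opt/file06.txt");
        // Port given explicitly: the client connects to 9000 as configured
        URI withPort = URI.create("hdfs://master:9000/opt/file06.txt");
        System.out.println("no-port URI port:  " + noPort.getPort());   // -1
        System.out.println("explicit URI port: " + withPort.getPort()); // 9000
    }
}
```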

2. Cannot create file/opt/file02.txt. Name node is in safe mode.

In Hadoop we often run into safe mode. The solutions found online can make the problem go away, but they rarely explain what safe mode actually means.

In fact, when the NameNode is in safe mode, we usually only need to wait for it to leave safe mode automatically; we do not have to force it out with a command.

Command to force safe mode exit: bin/hadoop dfsadmin -safemode leave
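The "just wait" advice can be expressed as a small polling helper. The sketch below is generic and does not depend on Hadoop; in a real program, the BooleanSupplier could be backed by something like DistributedFileSystem.setSafeMode with the SAFEMODE_GET action, which reports whether the NameNode is still in safe mode — treat that wiring as an assumption about your Hadoop version.

```java
import java.util.function.BooleanSupplier;

public class SafeModeWait {
    /**
     * Polls until the condition clears or the timeout expires.
     * Returns true if the condition cleared in time, false on timeout.
     */
    public static boolean waitUntilClear(BooleanSupplier inSafeMode,
                                         long timeoutMillis, long pollMillis)
            throws InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMillis;
        while (inSafeMode.getAsBoolean()) {
            if (System.currentTimeMillis() >= deadline) {
                return false; // still in safe mode after the timeout
            }
            Thread.sleep(pollMillis); // back off before polling again
        }
        return true; // safe mode cleared
    }
}
```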

For an introduction to safe mode, see "Hadoop: resolving 'Name node is in safe mode'".

Appendix:

1) Code for programs that operate on the HDFS file system can be found at the link "How to use the Java API to read and write HDFS".

2) Copying a local file to the Hadoop file system using the Java API:

package com.langgo.hadoop3;

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.util.Progressable;

/**
 * @author hadoop
 * Copies a local file to the Hadoop file system.
 */
public class FileCopyWithProgress {

    public static void main(String[] args) throws IOException {
        String localSrc = "/home/wqj/opt/140702152709log.txt";
        String dst = "hdfs://master:9000/opt/file06.txt";
        FileCopyWithProgress.fileCopy(localSrc, dst);
    }

    public static void fileCopy(String localSrc, String dst) throws IOException {
        InputStream in = new BufferedInputStream(new FileInputStream(localSrc));
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(dst), conf);
        // create() reports write progress through the Progressable callback
        OutputStream out = fs.create(new Path(dst), new Progressable() {
            @Override
            public void progress() {
                System.out.println(". ");
            }
        });
        // The final 'true' closes both streams when the copy finishes
        IOUtils.copyBytes(in, out, 4096, true);
    }
}

This concludes "How to resolve exceptions when uploading local files to the HDFS distributed file system through the HDFS API in Eclipse". I hope the above content is of some help and lets you learn more. If you think the article is good, please share it for more people to see.



