
Tracking Down Why Hadoop Cannot Upload Files

2025-03-04 Update From: SLTechnology News&Howtos


Shulou (Shulou.com) 06/03 report:

The cluster had just been deployed, and uploading a test file to the HDFS file system failed with the exception "could only be replicated to 0 nodes, instead of 1".

Searching Baidu for a solution turned up this post:

Blog link: http://www.cnblogs.com/linjiqin/archive/2013/03/13/2957310.html

The blog points out the following possible causes for the exception:

1. Does the system or HDFS have enough space?

There certainly should be; the test data is only a few dozen KB. You can check with hadoop dfsadmin -report, or look at the capacity directly in the web UI at http://localhost:50070.
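For reference, a minimal sketch of checking both views of the space, the capacity HDFS reports and the free space on the local partition underneath it; the path /tmp/hadoop is only an example of where the data directories might live:

    # HDFS view: configured capacity and DFS used/remaining per DataNode
    hadoop dfsadmin -report

    # Local view: free space on the partition holding Hadoop's data directories
    df -h /tmp/hadoop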

2. Is the number of DataNodes normal?

On the master node the NameNode, JobTracker, and other processes are all present, and every slave node has its DataNode and TaskTracker processes, so this cause can be ruled out, as the check below confirms.
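One quick way to verify this is the jps tool that ships with the JDK; a sketch, assuming a Hadoop 1.x layout with JobTracker and TaskTracker:

    # On the master: expect NameNode, SecondaryNameNode, JobTracker
    jps

    # On each slave: expect DataNode, TaskTracker
    jps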

3. Is HDFS in safe mode?

"Safe mode is OFF" is shown in the web UI, indicating that safe mode is off. If you prefer to check from the command line, hadoop dfsadmin -safemode get reports whether safe mode is on, and hadoop dfsadmin -safemode leave forces the NameNode out of safe mode.
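For example, using the standard Hadoop 1.x dfsadmin subcommands:

    # Report whether safe mode is on or off
    hadoop dfsadmin -safemode get

    # Force the NameNode out of safe mode
    hadoop dfsadmin -safemode leave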

4. Is the firewall turned off?

On Ubuntu the command to turn off the firewall is sudo ufw disable, and on my cluster it was already off.
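A minimal sketch of checking and disabling ufw on each node (ufw requires root, hence sudo):

    # Show the current firewall state
    sudo ufw status

    # Turn the firewall off
    sudo ufw disable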

5. Stop Hadoop, reformat the NameNode, and restart Hadoop.

As a last resort I tried this as well, but unfortunately it made no difference to my problem.
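For completeness, the usual Hadoop 1.x sequence for this step looks roughly like the following; note that reformatting wipes all existing HDFS data:

    # Stop all daemons on the cluster
    stop-all.sh

    # Reformat the NameNode (destroys existing HDFS metadata and data!)
    hadoop namenode -format

    # Start the daemons again
    start-all.sh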

The final solution:

It turned out my Linux fundamentals were not solid, and I had overlooked a very common mistake: the partition holding Hadoop's directory had run out of space, which is what triggered this error. I changed the DFS directory setting in core-site.xml to point to a directory on a partition with free space, reformatted HDFS, and the test case ran successfully.
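A sketch of the kind of change involved, assuming the default setup where the DFS name and data directories live under hadoop.tmp.dir; the path /data/hadoop/tmp is just an example of a directory on a partition with free space:

    <!-- core-site.xml: move Hadoop's working directory to a partition with space -->
    <property>
      <name>hadoop.tmp.dir</name>
      <value>/data/hadoop/tmp</value>
    </property>

After changing it, reformat with hadoop namenode -format and restart the cluster so the new location takes effect.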

Summary:

Very often the problem is not in what we are currently focused on but hidden in some tiny detail. This one was my own mistake, and I own it.
