Today I will walk you through how to use the Hadoop Java API. Many people may not be very familiar with it, so I have put together the following examples in the hope that you will get something useful out of this article.
Note: the Hadoop client jar versions must match the version of the remote Hadoop cluster.
Maven configuration (pom.xml):
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>Hadoop</groupId>
    <artifactId>demo</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <hadoop.version>2.7.1</hadoop.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>commons-io</groupId>
            <artifactId>commons-io</artifactId>
            <version>2.4</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-core</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-yarn-common</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-yarn-api</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-yarn-client</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.12</version>
        </dependency>
    </dependencies>
</project>
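With these dependencies in place, a quick standalone check confirms that the client can reach the cluster before running the JUnit cases below. This is only a minimal sketch, assuming the same NameNode address (hdfs://yarn001:9000) and user (root) as the tests; adjust both for your environment.

package com.demo;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import java.net.URI;

public class ConnectionCheck {
    public static void main(String[] args) throws Exception {
        // Assumed NameNode address and user, matching the tests below.
        FileSystem fs = FileSystem.get(
                new URI("hdfs://yarn001:9000"), new Configuration(), "root");
        System.out.println("HDFS root exists: " + fs.exists(new Path("/")));
        fs.close();
    }
}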
Test cases:
package com.demo;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.junit.Before;
import org.junit.Test;

import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.URI;
import java.net.URISyntaxException;

public class HadoopTest {

    FileSystem fileSystem = null;

    @Before
    public void init() throws URISyntaxException, IOException, InterruptedException {
        // Connect to the remote HDFS NameNode as user "root".
        String p = "hdfs://yarn001:9000";
        fileSystem = FileSystem.get(new URI(p), new Configuration(), "root");
    }

    /**
     * Test file download
     */
    @Test
    public void downloadTest() throws URISyntaxException, IOException {
        Path path = new Path("/hadoop-2.7.1.tar.gz");
        InputStream open = fileSystem.open(path);
        FileOutputStream fileOutputStream = new FileOutputStream("d://hadoop");
        // The final "true" closes both streams when the copy finishes.
        IOUtils.copyBytes(open, fileOutputStream, 4096, true);
    }

    /**
     * Test file upload 1
     */
    @Test
    public void uploadFileTest1() throws IOException {
        InputStream fileInputStream = new FileInputStream("d://SpringBoot.mobi");
        Path path = new Path("/SpringBoot");
        FSDataOutputStream fsDataOutputStream = fileSystem.create(path);
        IOUtils.copyBytes(fileInputStream, fsDataOutputStream, 4096);
    }

    /**
     * Test file upload 2
     */
    @Test
    public void uploadFileTest2() throws IOException {
        Path localPath = new Path("d://test.xls");
        Path remotePath = new Path("/testXLS");
        fileSystem.copyFromLocalFile(localPath, remotePath);
    }

    /**
     * Test delete file
     */
    @Test
    public void delFileTest() throws IOException {
        Path path = new Path("/testXLS");
        // Delete a file or an empty directory.
        boolean delete = fileSystem.delete(path, false);
        // To delete a non-empty directory, delete recursively:
        // boolean delete1 = fileSystem.delete(path, true);
        System.out.println(delete ? "Delete successful" : "Delete failed");
    }

    /**
     * Create directory test
     */
    @Test
    public void createFolder() throws IOException {
        Path path = new Path("/testPath2");
        boolean mkdirs = fileSystem.mkdirs(path);
        System.out.println(mkdirs ? "success" : "fail");
    }
}
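As a companion to the upload, download, and delete cases above, listing a directory is often useful for verifying results. The sketch below is an addition, not part of the original test class; it assumes the same NameNode address and user, and simply lists the children of the HDFS root.

package com.demo;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import java.net.URI;

public class ListingCheck {
    public static void main(String[] args) throws Exception {
        // Same assumed NameNode address and user as in HadoopTest.
        FileSystem fileSystem = FileSystem.get(
                new URI("hdfs://yarn001:9000"), new Configuration(), "root");

        // List the direct children of the HDFS root directory.
        for (FileStatus status : fileSystem.listStatus(new Path("/"))) {
            System.out.println((status.isDirectory() ? "dir  " : "file ")
                    + status.getPath() + "  " + status.getLen() + " bytes");
        }
        fileSystem.close();
    }
}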
Common exceptions:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.tracing.SpanReceiverHost.get(Lorg/apache/hadoop/conf/Configuration;Ljava/lang/String;)Lorg/apache/hadoop/tracing/SpanReceiverHost;
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:634)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:619)
Resolution:
Adjust the Maven pom.xml so that the local Hadoop client libraries match the HDFS version of the remote Hadoop cluster (here, 2.7.1).
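When diagnosing a mismatch like the NoSuchMethodError above, it helps to confirm which Hadoop version is actually on the client classpath and compare it with the output of "hadoop version" on the cluster. Below is a minimal sketch using the standard org.apache.hadoop.util.VersionInfo utility; the class name VersionCheck is just illustrative.

package com.demo;

import org.apache.hadoop.util.VersionInfo;

public class VersionCheck {
    public static void main(String[] args) {
        // Version of the Hadoop jars on the classpath; this should match
        // the version reported by "hadoop version" on the remote cluster.
        System.out.println("Client Hadoop version: " + VersionInfo.getVersion());
        System.out.println("Build details: " + VersionInfo.getBuildVersion());
    }
}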
After reading the above, do you have a better understanding of how to use the Hadoop Java API? If you want to learn more, please follow the industry information channel. Thank you for your support.