2025-01-16 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)06/03 Report--
1 Create a project
In IDEA, click Project -- Maven -- Next.
2 Import the project's dependency jars through Maven
(1) Set Maven to import dependent jars automatically
Check "Import Maven projects automatically" and click Apply.
(2) Configure the pom.xml file
The pom.xml configuration file is as follows:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.kaikeba.hadoop</groupId>
    <artifactId>com.kaikeba.hadoop</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>jar</packaging>

    <properties>
        <hadoop.version>2.7.3</hadoop.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>commons-cli</groupId>
            <artifactId>commons-cli</artifactId>
            <version>1.2</version>
        </dependency>
        <dependency>
            <groupId>commons-logging</groupId>
            <artifactId>commons-logging</artifactId>
            <version>1.1.3</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-jobclient</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-app</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-hs</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
    </dependencies>
</project>

3 Write an HDFS read-write program

Transfer a local file to HDFS:

package com.kaikeba.hadoop.hdfs;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

import java.io.*;
import java.net.URI;

/**
 * Writes a file from the local file system to HDFS through the Java API.
 */
public class FileCopyFromLocal {
    public static void main(String[] args) {
        String source = "E:\\aa.mp4";
        // Make sure the /data directory exists on HDFS (change according to your environment)
        String destination = "hdfs://122.51.241.109:9000/data/hdfs01.mp4";
        InputStream in = null;
        try {
            in = new BufferedInputStream(new FileInputStream(source));
            // Configuration used by HDFS for reads and writes
            Configuration conf = new Configuration();
            // Obtain a file system object
            FileSystem fs = FileSystem.get(URI.create(destination), conf);
            // Open an output stream on HDFS
            OutputStream out = fs.create(new Path(destination));
            IOUtils.copyBytes(in, out, 4096, true);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

Transfer a file from HDFS to local:

package com.kaikeba.hadoop.hdfs;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

import java.io.BufferedOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.net.URI;

/**
 * Reads a file from HDFS and writes it to the local file system.
 * Run from the packaged jar:
 *   [bruce@node-01 Desktop]$ hadoop jar com.kaikeba.hadoop-1.0-SNAPSHOT.jar com.kaikeba.hadoop.hdfs.FileReadFromHdfs
 */
public class FileReadFromHdfs {
    public static void main(String[] args) {
        try {
            String srcFile = "hdfs://122.51.241.109:9000/data/hdfs01.mp4";
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(URI.create(srcFile), conf);
            // Open an input stream on HDFS
            FSDataInputStream hdfsInStream = fs.open(new Path(srcFile));
            // Write to the local file system
            BufferedOutputStream outputStream = new BufferedOutputStream(new FileOutputStream("/opt/hdfs01.mp4"));
            IOUtils.copyBytes(hdfsInStream, outputStream, 4096, true);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

4 Verify by running the jar package
In the Maven panel, double-click the package lifecycle task.
This generates com.kaikeba.hadoop-1.0-SNAPSHOT.jar; copy it to the server to execute.
Execute command: hadoop jar com.kaikeba.hadoop-1.0-SNAPSHOT.jar com.kaikeba.hadoop.hdfs.FileReadFromHdfs
Note: com.kaikeba.hadoop.hdfs.FileReadFromHdfs is the fully qualified class name; change it to match your own project.
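Both programs above hand the actual byte transfer to IOUtils.copyBytes(in, out, 4096, true): read 4096-byte chunks until the input is exhausted, then close both streams. As a rough, self-contained sketch of that buffered copy loop (CopyBytesSketch and its copyBytes method are illustrative names, not the Hadoop API), using only java.io so it runs without a cluster:

```java
import java.io.*;

// Minimal sketch of a 4096-byte buffered stream copy, mirroring what the
// examples above rely on IOUtils.copyBytes to do. Illustrative only.
public class CopyBytesSketch {
    // Copy everything from in to out in bufSize chunks;
    // close both streams afterwards if close is true.
    public static long copyBytes(InputStream in, OutputStream out,
                                 int bufSize, boolean close) throws IOException {
        byte[] buf = new byte[bufSize];
        long total = 0;
        try {
            int n;
            while ((n = in.read(buf)) > 0) {
                out.write(buf, 0, n);
                total += n;
            }
            out.flush();
        } finally {
            if (close) {
                in.close();
                out.close();
            }
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // Simulate a 10000-byte file with in-memory streams
        byte[] data = new byte[10000];
        for (int i = 0; i < data.length; i++) data[i] = (byte) (i % 251);
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long copied = copyBytes(new ByteArrayInputStream(data), sink, 4096, true);
        System.out.println("copied " + copied + " bytes, intact="
                + java.util.Arrays.equals(data, sink.toByteArray()));
    }
}
```

When running against a real cluster, the only difference is where the streams come from: fs.create(...) and fs.open(...) return HDFS-backed streams, but the chunked copy is the same.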