
Hadoop single node environment configuration

2025-02-24 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/03 Report--

1. Configure the environment

1.1. Software package

Sun JDK or OpenJDK: 1.7

1.2. Install the required packages:

# yum install ssh rsync

# rpm -ivh jdk-7u79-linux-x64.rpm

1.3. Configure the Java environment

# vim /etc/profile

export JAVA_HOME=/usr/java/latest

export JRE_HOME=$JAVA_HOME/jre

export PATH=$PATH:$JAVA_HOME/bin

export CLASSPATH=.:$JAVA_HOME/lib:$JRE_HOME/lib

# vim hadoop_home/etc/hadoop/hadoop-env.sh

export JAVA_HOME=/usr/java/latest (or, for OpenJDK: export JAVA_HOME=/usr/lib/jvm/jre-1.7.0-openjdk.x86_64)
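Before committing the exports to /etc/profile, you can sanity-check how the paths resolve by running them in a subshell. A minimal sketch, using the same Oracle-JDK paths as above:

```shell
# Sketch: run the exports in a subshell and inspect the resulting CLASSPATH
(
  export JAVA_HOME=/usr/java/latest
  export JRE_HOME=$JAVA_HOME/jre
  export PATH=$PATH:$JAVA_HOME/bin
  export CLASSPATH=.:$JAVA_HOME/lib:$JRE_HOME/lib
  echo "CLASSPATH resolves to: $CLASSPATH"
)
# → CLASSPATH resolves to: .:/usr/java/latest/lib:/usr/java/latest/jre/lib
```

If the echoed paths look right, add the same lines to /etc/profile and re-login (or `source /etc/profile`).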

2. Single-node configuration files

# vim hadoop_home/etc/hadoop/core-site.xml

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

# vim hadoop_home/etc/hadoop/hdfs-site.xml

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
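The two files above can also be generated in one step with heredocs. A sketch, where `hadoop_home` stands in for your actual Hadoop install directory (the default path below is hypothetical so the snippet runs anywhere):

```shell
# Sketch: write both single-node config files with heredocs.
# hadoop_home is a placeholder; point it at your real install directory.
hadoop_home=${hadoop_home:-./hadoop_home}
mkdir -p "$hadoop_home/etc/hadoop"

# core-site.xml: default filesystem URI for HDFS
cat > "$hadoop_home/etc/hadoop/core-site.xml" <<'EOF'
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF

# hdfs-site.xml: replication factor 1 for a single node
cat > "$hadoop_home/etc/hadoop/hdfs-site.xml" <<'EOF'
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
EOF
```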

3. Set up passwordless SSH login

# ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa

# cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys

# chmod 0600 ~/.ssh/authorized_keys

4. Test environment

4.1. Format the filesystem

# hadoop_home/bin/hdfs namenode -format

4.2. Start the NameNode and DataNode daemons

# hadoop_home/sbin/start-dfs.sh

4.3. Browse the NameNode web interface

http://localhost:50070/

4.4. Create the HDFS directories required to run MapReduce jobs

# hadoop_home/bin/hdfs dfs -mkdir /user

# hadoop_home/bin/hdfs dfs -mkdir /user/<username>   # the system user that will run your MapReduce jobs

4.5. Copy the input files into the distributed filesystem

# hadoop_home/bin/hdfs dfs -put etc/hadoop input

4.6. Run the example job shipped with the distribution

# hadoop_home/bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.3.jar grep input output 'dfs[a-z.]+'
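What this example job computes can be previewed locally: it extracts every match of the regex `dfs[a-z.]+` from the input files and counts the occurrences of each match. A local approximation with plain grep, using a made-up three-line input:

```shell
# Sketch of the grep example's logic against hypothetical input:
# extract each regex match, then count occurrences per distinct match
printf 'dfs.replication\ndfs.replication\nfs.defaultFS\n' |
  grep -oE 'dfs[a-z.]+' | sort | uniq -c | sort -rn
# → "2 dfs.replication"  (fs.defaultFS has no literal "dfs", so it is not matched)
```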

4.7. Copy the output files from the distributed filesystem to the local filesystem

# hadoop_home/bin/hdfs dfs -get output output

4.8. View the execution result locally

# cat hadoop_home/output/*

4.9. View the output on the distributed filesystem

# hadoop_home/bin/hdfs dfs -cat output/*

At this point, the single-node Hadoop environment has been configured and tested.
