2025-02-24 Update From: SLTechnology News&Howtos > Development
Shulou(Shulou.com)06/02 Report--
This article gives a detailed explanation of how to achieve Hadoop Hdfs configuration. The editor thinks it is very practical, so it is shared here as a reference. I hope you can get something out of it after reading.
Hadoop Hdfs configuration
Environment:
Jdk1.6
Hadoop-0.20.1
Fuse-2.8.1
Jdk1.6 download address
http://cds.sun.com/is-bin/INTERSHOP.enfinity/WFS/CDS-CDS_Developer-Site/en_US/-/USD/VerifyItem-Start/jdk-6u14-linux-i586.bin?BundledLineItemUUID=.1JIBe.mR4oAAAEh2jkH4cNK&OrderID=SHBIBe.mbIYAAAEhwDkH4cNK&ProductID=4ZhIBe.o8c4AAAEhkh0zLjfT&FileName=/jdk-6u14-linux-i586.bin
Hadoop-0.20.1 download address http://www.apache.org/dyn/closer.cgi/hadoop/core/
Fuse-2.8.1 download address http://sourceforge.net/projects/fuse/files/fuse-2.X/
NameNode: 192.168.1.11, CentOS 5.3, hostname: master-dfs
JobTracker: 192.168.1.11 (this can also be configured on a separate machine)
DataNode: 192.168.1.12, CentOS 5.3, hostname: data-dfs
Client: 192.168.1.13, CentOS 5.3, hostname: client-dfs
Prerequisites for Hadoop Hdfs configuration
Configure ssh automatic login. For more information, please see http://hadoop.apache.org/common/docs/r0.20.0/quickstart.html.
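The passwordless login the quickstart describes boils down to generating a key pair on master-dfs and appending the public key to each node's authorized_keys. A minimal sketch, with the key written to ./id_rsa_demo (an illustrative path, so it can be tried without touching ~/.ssh; on the real cluster you would use the default ~/.ssh/id_rsa):

```shell
# Generate an RSA key pair with an empty passphrase so ssh does not prompt.
ssh-keygen -t rsa -N "" -f ./id_rsa_demo -q

# On the real cluster, copy the public key to each node, e.g.:
#   cat ~/.ssh/id_rsa.pub | ssh data-dfs 'cat >> ~/.ssh/authorized_keys'
# After that, "ssh data-dfs" from master-dfs should log in without a password.
cat ./id_rsa_demo.pub
```

The empty passphrase (-N "") is what makes the login fully automatic; with a passphrase you would need ssh-agent as well.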
Installation
1. Install JDK, extract it to /usr/local/jdk1.6, and configure the environment in /etc/profile:
export JAVA_HOME=/usr/local/jdk1.6
export PATH=$JAVA_HOME/bin:$JAVA_HOME/jre:$PATH
export CLASSPATH=$JAVA_HOME/lib:$JAVA_HOME/lib/tools.jar
2. Download Hadoop and decompress it to the /data/soft directory of each server; the directory should be identical on every machine.
tar zxvf hadoop-0.20.1.tar.gz
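Since the path must match on every server, one way to keep it identical is to unpack once and copy the tree to the other nodes. A hypothetical sketch (hostnames taken from the node list above; DRY_RUN=echo makes it print the commands instead of executing them, so it can be read or run safely outside the cluster):

```shell
# Unpack on the current node, then push the identical tree to the others.
# Set DRY_RUN="" on the real cluster to actually execute.
DRY_RUN=echo
$DRY_RUN tar zxvf hadoop-0.20.1.tar.gz -C /data/soft
for host in data-dfs client-dfs; do
  $DRY_RUN scp -r /data/soft/hadoop-0.20.1 "$host":/data/soft/
done
```

scp works here because passwordless ssh was already configured in the prerequisites.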
Configuration
Configuration description
For details on Hadoop's specific configuration options, see http://www.hadoop.org.cn/document/cn/r0.18.2/cluster_setup.html
1. Modify conf/hadoop-env.sh to add JDK support: export JAVA_HOME=/usr/local/jdk1.6
2. cd /data/soft/hadoop-0.20.1, then modify the conf/core-site.xml file in the hadoop directory by adding the following:

<configuration>
  <property>
    <!-- your NameNode configuration: machine name plus port -->
    <name>fs.default.name</name>
    <value>hdfs://ws:54310</value>
  </property>
  <property>
    <!-- local file system path where the NameNode persists the namespace and transaction log -->
    <name>dfs.name.dir</name>
    <value>/data/hdfs/hdfs1</value>
  </property>
  <property>
    <!-- comma-separated list of local file system paths where the DataNode stores block data -->
    <name>dfs.data.dir</name>
    <value>/data/hdfs/hdfsdata1</value>
  </property>
  <property>
    <!-- Hadoop's default temporary path; it is best to configure this. If a newly added
         node (or some DataNode that inexplicably will not start) fails to come up, delete
         the tmp directory on that machine. Note that if you delete this directory on the
         NameNode machine, you will need to re-run the NameNode format command. -->
    <name>hadoop.tmp.dir</name>
    <value>/data/hdfs/hdfstmp</value>
    <description>A base for other temporary directories.</description>
  </property>
  <property>
    <!-- number of replicas kept for each data block; the default is 3 -->
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
3. Modify conf/mapred-site.xml and add the following:

  <property>
    <!-- host (or IP) and port of the JobTracker -->
    <name>mapred.job.tracker</name>
    <value>ws:54311</value>
  </property>
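These property files can also be written from the shell with a heredoc. A minimal sketch for the mapred-site.xml snippet above; CONF_DIR points at a scratch directory here (an assumption for safe experimentation), while on the real cluster it would be /data/soft/hadoop-0.20.1/conf:

```shell
# Write the JobTracker property shown above into a conf directory.
CONF_DIR=./conf-sketch
mkdir -p "$CONF_DIR"
cat > "$CONF_DIR/mapred-site.xml" <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>ws:54311</value>
  </property>
</configuration>
EOF
```

The same heredoc pattern works for core-site.xml, which keeps the per-node configuration scriptable and repeatable.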
4. Configure the masters file: modify conf/masters
master-dfs
Modify the hosts file: vi /etc/hosts
192.168.1.11 master-dfs
192.168.1.12 data-dfs
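With the configuration in place, the usual next steps on Hadoop 0.20.x are to format the NameNode (first run only, since formatting erases any existing HDFS metadata) and then start the daemons. A sketch, again with DRY_RUN=echo so the commands are only printed when run outside the cluster:

```shell
# Format the NameNode (only on first setup), then start HDFS and MapReduce.
# Set DRY_RUN="" on the real master-dfs node to actually execute.
HADOOP_HOME=/data/soft/hadoop-0.20.1
DRY_RUN=echo
$DRY_RUN "$HADOOP_HOME/bin/hadoop" namenode -format
$DRY_RUN "$HADOOP_HOME/bin/start-dfs.sh"      # starts NameNode + DataNodes
$DRY_RUN "$HADOOP_HOME/bin/start-mapred.sh"   # starts JobTracker + TaskTrackers
```

Note the warning in the hadoop.tmp.dir comment above: re-running the format command is exactly what becomes necessary if that directory is deleted on the NameNode machine.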
This is the end of this article on "how to achieve Hadoop Hdfs configuration". I hope the above content has been of some help and that you have learned something from it. If you think the article is good, please share it so more people can see it.