2025-02-24 Update From: SLTechnology News&Howtos shulou
Shulou(Shulou.com)06/03 Report--
1. Environment description
Master 192.168.0.223 mesos-master
Slave 192.168.0.225 mesos-slave
2. Environment preparation
Turn off the firewall
Disable SELinux
Set the hostnames of the two machines to master and slave
Add /etc/hosts entries so the two machines can resolve each other
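The host-resolution step above can be sketched as follows. This is a safe demo: the entries are written to a temp file instead of the real /etc/hosts, and the root-only firewall/SELinux commands are shown only as comments.

```shell
# Sketch of the host entries both machines need in /etc/hosts.
# Uses a temp file so the demo does not touch the real /etc/hosts.
# (Disabling the firewall/SELinux would be: systemctl stop firewalld; setenforce 0)
HOSTS=$(mktemp)
cat >> "$HOSTS" <<'EOF'
192.168.0.223 master
192.168.0.225 slave
EOF
grep -cE 'master|slave' "$HOSTS"   # → 2
```

On the real machines the same two lines would be appended to /etc/hosts on both hosts.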
3. Configure SSH mutual trust between master and slave
Mutual trust is set up for the hadoop user, because Hadoop is started as that user.
On master:
yum -y install sshpass
ssh-keygen                 # press Enter at every prompt
ssh-copy-id -i ~/.ssh/id_rsa.pub hadoop@192.168.0.220
On slave:
yum -y install sshpass
ssh-keygen                 # press Enter at every prompt
ssh-copy-id -i ~/.ssh/id_rsa.pub hadoop@192.168.0.201
Test by ssh-ing into the other host; if no password prompt appears, mutual trust is working.
4. Install JDK
tar zxvf jdk-8u65-linux-x64.tar.gz
mv jdk1.8.0_65 /usr/jdk
4.1 set environment variables
Append the following to /etc/profile:
export JAVA_HOME=/usr/jdk
export JRE_HOME=/usr/jdk/jre
export CLASSPATH=.:$CLASSPATH:$JAVA_HOME/lib:$JRE_HOME/lib
export PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin
Then run source /etc/profile.
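The effect of editing and sourcing the profile can be sketched safely against a temp file (same export lines, sourced into the current shell):

```shell
# Sketch: write the JDK exports to a temp file and source it, mirroring
# what appending to /etc/profile followed by `source /etc/profile` does.
PROFILE=$(mktemp)
cat > "$PROFILE" <<'EOF'
export JAVA_HOME=/usr/jdk
export JRE_HOME=/usr/jdk/jre
export CLASSPATH=.:$CLASSPATH:$JAVA_HOME/lib:$JRE_HOME/lib
export PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin
EOF
. "$PROFILE"
echo "$JAVA_HOME"   # → /usr/jdk
```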
4.2 Test JDK
java -version   # the version information should appear
5. Install Mesos on both master and slave (the Mesos installation itself is covered in other posts)
When the installation is complete, a libmesos.so file will be present under /usr/local/lib.
6. Install and configure Hadoop
Master and slave
tar zxvf hadoop-2.5.0-cdh6.4.8.tar.gz
mv hadoop-2.5.0-cdh6.4.8 /usr/hadoop
cd /usr/hadoop
mkdir -p tmp
mv bin bin-mapreduce2
ln -s bin-mapreduce1 bin
mv example example-mapreduce2
ln -s example-mapreduce1 example
cd etc/
mv hadoop hadoop-mapreduce2
ln -s hadoop-mapreduce1 hadoop
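The move-and-symlink pattern used above (parking the MRv2 layout and pointing the live name at the MRv1 one) can be sanity-checked in a scratch directory before touching /usr/hadoop:

```shell
# Demo of the mv + ln -s pattern in a temp dir, so nothing under /usr/hadoop is touched.
D=$(mktemp -d)
cd "$D"
mkdir bin bin-mapreduce1        # stand-ins for the real directories
mv bin bin-mapreduce2           # park the MRv2 tools
ln -s bin-mapreduce1 bin        # make `bin` point at the MRv1 tools
readlink bin                    # → bin-mapreduce1
```

After this, anything that looks up `bin` transparently gets the MRv1 directory, and the MRv2 tools remain available under `bin-mapreduce2`.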
7. Add hadoop environment variable
vim /etc/profile and add:
export HADOOP_HOME=/usr/hadoop
export PATH=$PATH:$HADOOP_HOME:$HADOOP_HOME/bin
Then run source /etc/profile.
8. Get the hadoop-on-mesos jar package
yum -y install maven openjdk-7-jdk git
git clone
cd hadoop
mvn package   # builds the jar package; it will be under target/
9. Put the acquired jar package in the hadoop installation directory
Master and slave
cp hadoop/target/hadoop-mesos-0.1.0.jar /usr/hadoop/share/hadoop/common/lib/
10. Configure hadoop on mesos
Master and slave
vim /usr/hadoop/etc/hadoop/mapred-site.xml and add the following properties:
<property>
  <name>mapred.job.tracker</name>
  <value>localhost:9001</value>
</property>
<property>
  <name>mapred.jobtracker.taskScheduler</name>
  <value>org.apache.hadoop.mapred.MesosScheduler</value>
</property>
<property>
  <name>mapred.mesos.taskScheduler</name>
  <value>org.apache.hadoop.mapred.JobQueueTaskScheduler</value>
</property>
<property>
  <name>mapred.mesos.master</name>
  <value>zk://192.168.0.223</value>
</property>
<property>
  <name>mapred.mesos.executor.uri</name>
  <value>hdfs://localhost:9000/hadoop-2.5.0-cdh6.2.0.tar.gz</value>
</property>
10. Give the hadoop user ownership
Master and slave
chown -R hadoop:hadoop /usr/hadoop
12. Start jobtracker on master and connect it to mesos
su hadoop
MESOS_NATIVE_LIBRARY=/usr/local/lib/libmesos.so hadoop jobtracker
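Note the one-shot environment-variable form used here: MESOS_NATIVE_LIBRARY is exported only into the environment of that single hadoop jobtracker invocation, not into the shell. The pattern can be illustrated with a plain echo:

```shell
# The `VAR=value cmd` form exports the variable only for that one command.
MESOS_NATIVE_LIBRARY=/usr/local/lib/libmesos.so sh -c 'echo "$MESOS_NATIVE_LIBRARY"'
# → /usr/local/lib/libmesos.so
echo "${MESOS_NATIVE_LIBRARY:-unset}"   # → unset (the parent shell is unaffected)
```

To make the setting permanent for the hadoop user, the export could instead go into that user's ~/.bashrc.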
13. Test
Open http://192.168.0.223:5050 in a browser and check whether hadoop appears in the Frameworks list.
-
Part 2: Build hadoop on mesos after setting up HDFS
1. After configuring HDFS according to the HDFS build document, you still need to configure core-site.xml, hdfs-site.xml, and mapred-site.xml
mv bin bin-mapreduce2
ln -s bin-mapreduce1 bin
# there is no need to move etc/hadoop; when building hadoop on mesos, modify it directly
2. Move the hdfs command and start-dfs.sh
cd /usr/hadoop/bin-mapreduce2
cp hdfs /usr/hadoop/bin-mapreduce1
cd /usr/hadoop/sbin
cp start-dfs.sh /usr/hadoop/bin-mapreduce1
3. Set up hadoop on mesos
Modify mapred-site.xml configuration file
<property>
  <name>mapred.job.tracker</name>
  <value>localhost:9002</value>   <!-- changed to 9002 to avoid conflicting with the ports in hdfs-site.xml -->
</property>
<property>
  <name>mapred.jobtracker.taskScheduler</name>
  <value>org.apache.hadoop.mapred.MesosScheduler</value>
</property>
<property>
  <name>mapred.mesos.taskScheduler</name>
  <value>org.apache.hadoop.mapred.JobQueueTaskScheduler</value>
</property>
<property>
  <name>mapred.mesos.master</name>
  <value>zk://192.168.0.223</value>
</property>
<property>
  <name>mapred.mesos.executor.uri</name>
  <value>hdfs://localhost:9000/hadoop-2.5.0-cdh6.2.0.tar.gz</value>
</property>
4. Format and start HDFS
hdfs namenode -format
start-dfs.sh
5. Start hadoop on mesos
su hadoop
MESOS_NATIVE_LIBRARY=/usr/local/lib/libmesos.so hadoop jobtracker