Installation method of hadoop-spark under CentOS

2025-01-16 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/01 Report--

This article introduces how to install hadoop-spark under CentOS. Many people have questions about this procedure, so the editor has consulted various materials and organized them into a simple, easy-to-follow set of steps. I hope it resolves your doubts about installing hadoop-spark under CentOS; please follow along and try it yourself.

Environment: CentOS + hadoop-2.5.2 + scala-2.10.5 + spark-1.3.1

1. Download a pre-built Spark package from http://spark.apache.org/downloads.html

2. Prepare scala

Download scala-2.10.5.rpm from http://www.scala-lang.org/. Do not download 2.11: the pre-built Spark binaries target Scala 2.10, and using 2.11 would require recompiling Spark.

Install the Scala RPM:

rpm -ivh scala-2.10.5.rpm

3. Decompress spark

tar -zxvf spark-1.3.1-bin-hadoop2.4.tar.gz

(The steps below assume the extracted directory sits under /usr/local.)

4. Configure environment variables

Append the following at the end of /etc/profile:

export SPARK_HOME=/usr/local/spark-1.3.1-bin-hadoop2.4
export PATH=$PATH:$SPARK_HOME/bin

Then make the changes take effect:

source /etc/profile
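The append-and-source pattern of step 4 can be sketched as below. A scratch file (./profile.demo, a stand-in chosen here) replaces the real /etc/profile so the snippet is safe to run anywhere:

```shell
# Demo of the step-4 /etc/profile edit, using a scratch file instead of
# the real /etc/profile.
PROFILE=./profile.demo
cat >> "$PROFILE" <<'EOF'
export SPARK_HOME=/usr/local/spark-1.3.1-bin-hadoop2.4
export PATH=$PATH:$SPARK_HOME/bin
EOF
# Source the file so the variables take effect in the current shell.
. "$PROFILE"
echo "$SPARK_HOME"
```

On a real node, replace ./profile.demo with /etc/profile and run the same commands as root.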

5. Configure spark

vi /usr/local/spark-1.3.1-bin-hadoop2.4/conf/spark-env.sh

Add at the end:

export JAVA_HOME=/usr/java/jdk1.7.0_76
export SPARK_MASTER_IP=192.168.1.21
export SPARK_WORKER_MEMORY=2g
export HADOOP_CONF_DIR=/usr/local/hadoop-2.5.2/etc/hadoop
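The same spark-env.sh settings can be appended non-interactively with a heredoc instead of vi. In this sketch, CONF_DIR points at a scratch directory (an assumption for safe demonstration); on a real node it would be /usr/local/spark-1.3.1-bin-hadoop2.4/conf:

```shell
# Append the step-5 settings to spark-env.sh without opening an editor.
# CONF_DIR is a scratch stand-in for the real Spark conf/ directory.
CONF_DIR=./conf.demo
mkdir -p "$CONF_DIR"
cat >> "$CONF_DIR/spark-env.sh" <<'EOF'
export JAVA_HOME=/usr/java/jdk1.7.0_76
export SPARK_MASTER_IP=192.168.1.21
export SPARK_WORKER_MEMORY=2g
export HADOOP_CONF_DIR=/usr/local/hadoop-2.5.2/etc/hadoop
EOF
# Show what was written.
cat "$CONF_DIR/spark-env.sh"
```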

6. Configure the slave node

vi /usr/local/spark-1.3.1-bin-hadoop2.4/conf/slaves

Add one worker hostname per line:

master
slaver1
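The slaves file from step 6 can likewise be written without an editor. CONF_DIR is again a scratch stand-in for /usr/local/spark-1.3.1-bin-hadoop2.4/conf:

```shell
# Write the step-6 worker list (one hostname per line) non-interactively.
# CONF_DIR is a scratch stand-in for the real Spark conf/ directory.
CONF_DIR=./conf.demo
mkdir -p "$CONF_DIR"
cat > "$CONF_DIR/slaves" <<'EOF'
master
slaver1
EOF
cat "$CONF_DIR/slaves"
```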

7. Copy the configuration file to slave node

scp -r /usr/local/spark-1.3.1-bin-hadoop2.4/ root@slaver1:/usr/local/

8. Start the cluster

cd /usr/local/spark-1.3.1-bin-hadoop2.4/sbin
./start-all.sh

9. Check whether the cluster starts successfully

Run jps on each node:

On master, check for: Master, Worker
On slaver1, check for: Worker

Then visit the Spark master web UI at http://master:8080/

This concludes the walkthrough of installing hadoop-spark under CentOS; I hope it has resolved your doubts. Theory works best when paired with practice, so go and try it yourself! To keep learning, please continue to follow the site; the editor will keep bringing you practical articles.




© 2024 shulou.com SLNews company. All rights reserved.
