This article introduces how to set up a Spark development environment on macOS.
Before installing the Spark environment, make sure that a suitable Java environment is configured; the JDK (or JRE) version needs to be 1.8.0 or later.
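As a quick check (assuming java is already on the PATH), you can verify the version in a terminal:
java -version
# should report something like: java version "1.8.0_181"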
(1) First, download a development IDE. A popular choice at present is IntelliJ IDEA, which can be downloaded from the official website at https://www.jetbrains.com/idea/.
(2) Build the Spark environment:
Launch Terminal on the Mac and use the brew install scala command (brew is a package management tool, similar to yum on CentOS or apt-get on Ubuntu) to download and install Scala, as follows:
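For example (assuming Homebrew itself is already installed):
brew install scala
scala -version   # confirm the installation; prints the installed Scala version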
Once the Scala environment is installed, download the Spark distribution from the official Spark website: http://spark.apache.org/
After downloading Spark, go to the folder containing the downloaded package and run tar -zxvf spark-2.3.1-bin-hadoop2.7.tgz.
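For example, the whole step from the command line (assuming the 2.3.1 archive from the Apache release archive; adjust the URL and version as needed):
cd ~/Downloads
curl -O https://archive.apache.org/dist/spark/spark-2.3.1/spark-2.3.1-bin-hadoop2.7.tgz
tar -zxvf spark-2.3.1-bin-hadoop2.7.tgz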
After installing Scala and Spark, configure the relevant environment variables: run vim /etc/profile and append the following lines:
export SCALA_HOME=/usr/local/Cellar/scala/2.12.4
export PATH=$PATH:$SCALA_HOME/bin
export SPARK_HOME=/Users/mengxin/Downloads/spark-2.3.1-bin-hadoop2.7
export PATH=$PATH:$SPARK_HOME/bin
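After saving, reload the profile so the variables take effect in the current shell, and confirm they are set:
source /etc/profile
echo $SPARK_HOME   # should print the Spark installation path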
(3) Configure SSH to the local machine. In the home directory, run the following commands:
ssh-keygen -t rsa -P ""
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
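You can then verify that passwordless login works (on macOS, Remote Login must be enabled under System Preferences > Sharing for sshd to accept connections):
ssh localhost
exit   # return to the original shell once the key-based login succeeds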
(4) Check whether Spark can be started.
Go to the sbin directory under the Spark package and run ./start-all.sh, as shown below:
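For example, using the SPARK_HOME variable configured above:
cd $SPARK_HOME/sbin
./start-all.sh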
You can check how the startup went in the .out log files under the logs directory of the Spark installation.
After Spark starts, the Master and Worker processes it launches are actually JVM processes, which we can inspect with the jps command.
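For example (the PIDs below are illustrative):
jps
# 12345 Master
# 12346 Worker
# 12347 Jps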
Then test with spark-shell: go to the bin directory of the Spark installation and launch it. If the Scala REPL prompt (scala>) appears, the Spark environment is set up correctly.
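As a minimal smoke test, you can pipe a one-line job into the REPL (a sketch; the exact result variable in the output may differ):
cd $SPARK_HOME/bin
echo 'sc.parallelize(1 to 100).sum()' | ./spark-shell
# the output should include a line like: res0: Double = 5050.0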