
Construction of stand-alone Spark environment under Windows 10

2025-04-02 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/03 Report--

[objective]

Build a standalone Spark environment on a bare Windows 10 machine so that the PySpark shell can run.

[raw materials]

Windows 10 x64

jdk-8u162-windows-x64.exe

python-3.6.7-amd64.exe

spark-2.3.2-bin-hadoop2.7.tgz

hadoop-2.7.7.tar.gz

winutils.exe

[install JDK]

Double-click jdk-8u162-windows-x64.exe to install; after that it is basically "Next" all the way.

Configure environment variables

JAVA_HOME: C:\Program Files\Java\jdk1.8.0_162        # can be verified with echo %JAVA_HOME%
CLASS_PATH: C:\Program Files\Java\jdk1.8.0_162\lib   # can be verified with echo %CLASS_PATH%
Path: C:\Program Files\Java\jdk1.8.0_162\bin         # can be verified with echo %PATH%
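The same variables can also be set from a command prompt instead of the System Properties dialog. A minimal sketch using the built-in setx command (setx writes to the persistent user environment and only takes effect in newly opened consoles; note that appending to Path this way merges the user and system Path values, so the GUI editor may be safer):

```shell
:: Persist JAVA_HOME and CLASS_PATH for the current user (new consoles only).
setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_162"
setx CLASS_PATH "C:\Program Files\Java\jdk1.8.0_162\lib"
:: Append the JDK bin directory to Path.
:: Caution: %Path% here expands the combined machine+user Path.
setx Path "%Path%;C:\Program Files\Java\jdk1.8.0_162\bin"
```

Open a new cmd window afterwards and run echo %JAVA_HOME% to verify.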

[install Python3]

Double-click python-3.6.7-amd64.exe to install.

To facilitate code completion, it is recommended to install the following two third-party packages

# install ipython
pip3 install ipython -i https://pypi.doubanio.com/simple/
# install pyreadline
pip3 install pyreadline -i https://pypi.doubanio.com/simple/

[install Spark]

Download spark-2.3.2-bin-hadoop2.7.tgz from http://spark.apache.org/downloads.html and extract it; walker extracted it to the D:\spark directory.

At this point pyspark can already run, but it will complain that winutils.exe cannot be found.

Configure environment variables

SPARK_HOME: D:\spark\spark-2.3.2-bin-hadoop2.7
Path: D:\spark\spark-2.3.2-bin-hadoop2.7\bin

[install Hadoop]

Download hadoop-2.7.7.tar.gz from https://archive.apache.org/dist/hadoop/common/hadoop-2.7.7/ and extract it. If you get the error "Can not create symbolic link: the client does not have the required privileges.", run the decompression software as administrator and extract again.

Configure environment variables

HADOOP_HOME: D:\spark\hadoop-2.7.7
Path: D:\spark\hadoop-2.7.7\bin

Modify D:\spark\hadoop-2.7.7\etc\hadoop\hadoop-env.cmd, otherwise an error like "Error: JAVA_HOME is incorrectly set. Please update F:\ hadoop\ conf\ hadoop-env.cmd" may be reported.

# because the JAVA_HOME value contains spaces, change
set JAVA_HOME=%JAVA_HOME%
# to the 8.3 short-name path (PROGRA~1 is the short name for "Program Files"):
set JAVA_HOME=C:\PROGRA~1\Java\jdk1.8.0_162

If you open cmd and type hadoop version, output like the following indicates success.

Hadoop 2.7.7
Subversion Unknown -r c1aad84bd27cd79c3d1a7dd58202a8c3ee1ed3ac
Compiled by stevel on 2018-07-18T22:47Z
Compiled with protoc 2.5.0
From source with checksum 792e15d20b12c74bd6f19a1fb886490
This command was run using /D:/spark/hadoop-2.7.7/share/hadoop/common/hadoop-common-2.7.7.jar

[winutils.exe]

Download the winutils.exe matching your Hadoop version from https://github.com/steveloughran/winutils and put it in D:\spark\hadoop-2.7.7\bin.
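Before moving on, it is easy to verify the layout with a small script. The helper below is a hypothetical check, not part of the original setup: it just confirms that the directory HADOOP_HOME points at actually contains bin\winutils.exe.

```python
import os


def check_hadoop_home(path):
    """Return True if `path` looks like a usable HADOOP_HOME on Windows,
    i.e. it contains a bin directory with winutils.exe inside."""
    return os.path.isfile(os.path.join(path, "bin", "winutils.exe"))


if __name__ == "__main__":
    home = os.environ.get("HADOOP_HOME", "")
    print("HADOOP_HOME ok:", bool(home) and check_hadoop_home(home))
```

If this prints False, either the environment variable was not picked up (open a new console) or winutils.exe is not in the bin directory.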

[Python script test]

The script is as follows (t.py):

from pyspark import SparkConf, SparkContext

conf = SparkConf().setMaster("local").setAppName("My App")
sc = SparkContext(conf=conf)
print('**%s**' % sc.appName)
sc.stop()

Run the script with the spark-submit t.py command; output like the following indicates success.

**My App**

[FAQ]

ModuleNotFoundError: No module named 'resource'

This likely means the Spark version you are using is too new for Windows (newer Python workers import the Unix-only resource module, which does not exist on Windows); switch to Spark 2.3.2 or below.

[related reading]

Configure the pyspark work environment on windows10

walker
