2025-02-25 Update From: SLTechnology News & Howtos
Shulou (Shulou.com) 06/02 Report --
This article explains which programming languages are supported by Spark. The content is simple and clear, and easy to follow.
1. What is the core of Spark?
The RDD (Resilient Distributed Dataset) is Spark's basic abstraction: an abstraction over distributed memory that lets you manipulate a distributed dataset as if it were a local collection. The RDD is also the core of Spark. It represents a partitioned, immutable dataset that can be operated on in parallel; different dataset formats correspond to different RDD implementations.
An RDD must be serializable. RDDs can be cached in memory, so the result of each operation on an RDD dataset can be kept in memory and fed directly into the next operation, saving much of the disk I/O that MapReduce incurs. For iterative workloads such as common machine learning algorithms and interactive data mining, this greatly improves efficiency.
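The two properties above, partitioning and immutability, can be illustrated without Spark itself. The following is a minimal, self-contained Python sketch of an RDD-like collection (the `ToyRDD` class and its method names are illustrative inventions, not Spark's actual implementation): transformations return a new dataset rather than mutating the old one, and actions combine results within each partition before combining across partitions.

```python
from functools import reduce

class ToyRDD:
    """A toy, in-memory stand-in for Spark's RDD: a partitioned,
    immutable collection whose transformations return new ToyRDDs."""

    def __init__(self, partitions):
        # Store each partition as a tuple so the data is immutable.
        self._partitions = [tuple(p) for p in partitions]

    def map(self, f):
        # Transformation: returns a NEW ToyRDD, never mutates this one.
        return ToyRDD([[f(x) for x in p] for p in self._partitions])

    def filter(self, pred):
        return ToyRDD([[x for x in p if pred(x)] for p in self._partitions])

    def reduce(self, f):
        # Action: combine within each partition, then across partitions,
        # mimicking how Spark aggregates per-partition partial results.
        partials = [reduce(f, p) for p in self._partitions if p]
        return reduce(f, partials)

    def collect(self):
        # Action: gather all partitions back into one local list.
        return [x for p in self._partitions for x in p]

# "Parallelize" a local list across two partitions.
rdd = ToyRDD([[1, 2, 3], [4, 5, 6]])
doubled = rdd.map(lambda x: x * 2)          # rdd itself is unchanged
print(doubled.collect())                    # [2, 4, 6, 8, 10, 12]
print(doubled.reduce(lambda a, b: a + b))   # 42
```

Because every transformation yields a fresh dataset, a lost partition can always be recomputed from its inputs; that is the property real RDDs exploit for fault tolerance.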
2. What are the applicable scenarios for Spark?
Due to the nature of RDDs, Spark is not suitable for applications that make asynchronous, fine-grained state updates, such as the storage layer of a web service or an incremental web crawler and indexer; it does not fit an incremental-modification model. That said, Spark's range of applicable scenarios is broad and fairly general.
3. What are the programming languages supported by Spark?
Spark exposes RDD operations through integration with programming languages, similar to DryadLINQ and FlumeJava. Each dataset is represented as an RDD object, and operations on the dataset are expressed as operations on the RDD object. The main programming languages supported by Spark are Scala, Java, and Python.
1) Scala. Spark is developed in Scala, and Scala is its default programming language. Writing Spark programs is much easier than writing Hadoop MapReduce programs. Spark also provides Spark-Shell, in which code can be tested interactively.
2) Java. Spark supports Java programming, but Java has no convenient interactive tool like Spark-Shell. Otherwise, programming in Java is much the same as in Scala: both are JVM languages and interoperate with each other, and the Java programming interface is in fact a wrapper around the Scala one.
3) Python. Spark also provides a Python programming interface. Spark uses py4j to achieve interoperation between Python and Java, making it possible to write Spark programs in Python. Spark also provides pyspark, a Python shell for Spark, which lets you write Spark programs in Python interactively.
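Regardless of the language chosen, a Spark program has the same shape: a chain of transformations ending in an action. As a hedged illustration, the sketch below approximates the classic word count using only plain Python builtins, so it runs without a Spark installation; the comments show the rough PySpark equivalent of each step (the `sc` handle and data are assumptions for illustration).

```python
from collections import Counter
from itertools import chain

# Stand-in for an input file; in PySpark this might be
# lines = sc.textFile("input.txt")  (illustrative, not run here)
lines = ["to be or not to be", "that is the question"]

# flatMap: split each line into words
# PySpark: lines.flatMap(lambda l: l.split())
words = chain.from_iterable(line.split() for line in lines)

# map: pair each word with a count of 1
# PySpark: words.map(lambda w: (w, 1))
pairs = ((w, 1) for w in words)

# reduceByKey: sum the counts per word
# PySpark: pairs.reduceByKey(lambda a, b: a + b)
counts = Counter()
for w, n in pairs:
    counts[w] += n

print(counts["to"], counts["be"])  # 2 2
```

The same pipeline transliterates almost line for line into Scala or Java, which is part of why Spark programs stay short compared with hand-written MapReduce jobs.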
Compared with MapReduce, Spark computes in memory and can read and write data in any format directly on Hadoop, so its batch processing is more efficient and its latency is lower.
Thank you for reading. The above is the content of "which programming languages are supported by Spark". After studying this article, you should have a deeper understanding of the programming languages supported by Spark; specific usage still needs to be verified in practice. More articles on related topics will follow; you are welcome to keep reading.