SLTechnology News & Howtos — Shulou (Shulou.com), updated 2025-02-27
Spark is a memory-based distributed computing framework with high performance, good scalability, and fault tolerance.
Today, let's talk about Spark Streaming. Strictly speaking, it is not true real-time stream processing: it divides the incoming stream into small time-based batches (micro-batches) and processes each batch as a unit, so it is better described as quasi-real-time stream computing.
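To make the micro-batch idea concrete, here is a minimal plain-Java sketch (a toy illustration, not Spark itself): records arriving over time are grouped into fixed-width time batches, and each batch is then processed as one unit. The class name, record values, and arrival times are all made up for illustration.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Toy sketch of micro-batching (not the Spark API): assign each record,
// by its arrival time in seconds, to a fixed-width batch window.
public class MicroBatchSketch {
    static List<List<String>> batch(List<String> records, List<Integer> arrivalSec, int widthSec) {
        int maxSec = 0;
        for (int t : arrivalSec) maxSec = Math.max(maxSec, t);
        int nBatches = maxSec / widthSec + 1;
        List<List<String>> batches = new ArrayList<>();
        for (int i = 0; i < nBatches; i++) batches.add(new ArrayList<>());
        // A record arriving at second t belongs to batch t / widthSec.
        for (int i = 0; i < records.size(); i++) {
            batches.get(arrivalSec.get(i) / widthSec).add(records.get(i));
        }
        return batches;
    }

    public static void main(String[] args) {
        // Four records arriving at seconds 1, 3, 7, 12 with 5-second batches.
        List<String> recs = Arrays.asList("a", "b", "c", "d");
        List<Integer> times = Arrays.asList(1, 3, 7, 12);
        System.out.println(batch(recs, times, 5)); // prints [[a, b], [c], [d]]
    }
}
```

The latency floor of this model is the batch width: a record arriving at second 1 is not processed until its 5-second window closes, which is exactly why Spark Streaming is "quasi" real-time.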
Runtime environment: a working Spark installation on Linux is required. This example uses CentOS 6.5 x64. The nc (netcat) utility is also needed, because the example feeds data to Spark Streaming over a TCP socket.
Installation: yum install nc (or install from the distribution media).
After installation, start a listener with nc -lk 9999. The port can be freely chosen; ports above 1024 are preferable because they do not require root privileges.
The code is posted below.

The Java version:

import java.util.Arrays;
import java.util.List;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.function.FlatMapFunction;
import org.apache.spark.api.java.function.Function2;
import org.apache.spark.api.java.function.PairFunction;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaPairDStream;
import org.apache.spark.streaming.api.java.JavaReceiverInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import com.google.common.base.Optional;
import scala.Tuple2;

public class SparkDemo {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("sparkDemo2").setMaster("local[3]");
        // Each 5-second window of input becomes one micro-batch.
        JavaStreamingContext jsc = new JavaStreamingContext(conf, Durations.seconds(5));
        // Stateful operators such as updateStateByKey require a checkpoint directory.
        jsc.checkpoint("D://chkspark");
        JavaReceiverInputDStream<String> socketTextStream =
                jsc.socketTextStream("10.115.27.234", 1000);
        // Split each incoming line into words.
        JavaDStream<String> wordsDstream = socketTextStream.flatMap(
                new FlatMapFunction<String, String>() {
                    private static final long serialVersionUID = 1L;
                    public Iterable<String> call(String line) throws Exception {
                        return Arrays.asList(line.split(" "));
                    }
                });
        // Map each word to a (word, 1) pair.
        JavaPairDStream<String, Integer> wordsToPairDstream = wordsDstream.mapToPair(
                new PairFunction<String, String, Integer>() {
                    private static final long serialVersionUID = 1L;
                    public Tuple2<String, Integer> call(String word) throws Exception {
                        return new Tuple2<String, Integer>(word, 1);
                    }
                });
        /* One batch corresponds to one RDD; updateStateByKey carries the
         * per-key count forward across batches. */
        JavaPairDStream<String, Integer> resultDstream = wordsToPairDstream.updateStateByKey(
                new Function2<List<Integer>, Optional<Integer>, Optional<Integer>>() {
                    private static final long serialVersionUID = 1L;
                    public Optional<Integer> call(List<Integer> values, Optional<Integer> state)
                            throws Exception {
                        Integer oldValue = 0; // the default old value is 0
                        if (state.isPresent()) {
                            oldValue = state.get();
                        }
                        for (Integer value : values) {
                            oldValue += value;
                        }
                        return Optional.of(oldValue);
                    }
                });
        // Print the result of each batch.
        resultDstream.print();
        jsc.start();
        jsc.awaitTermination();
    }
}
Program testing: type any strings into nc on the Linux side, and Spark Streaming will count the input words in near real time, similar to word count. Unless you kill the process manually, it runs forever: like water flowing from a tap, the input is a continuous stream of data.
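The accumulation logic of updateStateByKey can be isolated and checked without a Spark cluster. The sketch below is an assumed plain-Java model of its semantics (the class and method names are invented, not Spark API): for each key, the new values from the current batch are folded into the previous state, which is why counts keep growing across batches instead of resetting every 5 seconds.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Plain-Java model of updateStateByKey semantics (illustration only, not Spark):
// combine each key's new batch values with its previously accumulated state.
public class UpdateStateSketch {
    static Map<String, Integer> updateState(Map<String, Integer> state,
                                            Map<String, List<Integer>> batch) {
        Map<String, Integer> next = new HashMap<>(state);
        for (Map.Entry<String, List<Integer>> e : batch.entrySet()) {
            int old = next.getOrDefault(e.getKey(), 0); // the default old value is 0
            for (int v : e.getValue()) old += v;
            next.put(e.getKey(), old);
        }
        return next;
    }

    public static void main(String[] args) {
        Map<String, Integer> state = new HashMap<>();
        // Batch 1 input: "hello world hello"
        state = updateState(state, Map.of("hello", List.of(1, 1), "world", List.of(1)));
        // Batch 2 input: "hello"
        state = updateState(state, Map.of("hello", List.of(1)));
        System.out.println(state.get("hello") + " " + state.get("world")); // prints 3 1
    }
}
```

This mirrors the anonymous Function2 in the example above: values holds the (word, 1) counts from the current batch, and state holds the running total from all previous batches.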
Running results: each 5-second batch prints the accumulated word counts (screenshot not reproduced here).
You can also write the same program in Scala with considerably less code. Either way, a solid understanding of Spark Streaming's architectural principles is worthwhile.