Shulou
2025-01-19 Update From: SLTechnology News&Howtos
Shulou(Shulou.com) 06/01 Report--
In this article, the editor shares how to use Transform operators in Flink. Many readers may not be familiar with them, so the article is offered for reference; I hope you learn a lot from reading it.
Group aggregation

```java
import java.util.Objects;

import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.io.TupleCsvInputFormat;
import org.apache.flink.api.java.tuple.Tuple3;
import org.apache.flink.api.java.typeutils.TupleTypeInfo;
import org.apache.flink.core.fs.Path;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

String path = "E:\\GIT\\flink-learn\\flink-learn\\telemetering.txt";
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

TupleTypeInfo<Tuple3<String, Double, Long>> typeInfo =
        new TupleTypeInfo<>(Types.STRING, Types.DOUBLE, Types.LONG);
TupleCsvInputFormat<Tuple3<String, Double, Long>> tupleCsvInputFormat =
        new TupleCsvInputFormat<>(new Path(path), typeInfo);
DataStreamSource<Tuple3<String, Double, Long>> dataStreamSource =
        env.createInput(tupleCsvInputFormat, typeInfo);
// or: DataStreamSource<Tuple3<String, Double, Long>> dataStreamSource =
//         env.readFile(tupleCsvInputFormat, path);

SingleOutputStreamOperator<Tuple3<String, Double, Long>> operator = dataStreamSource
        .filter(Objects::nonNull)
        // .map(...)
        // .flatMap(...)
        .keyBy(tuple -> tuple.f0)   // or the positional form .keyBy(0)
        .minBy(1);
        // other rolling aggregations: .min(1), .max(1), .maxBy(1, false), .sum(1),
        // or the more general .reduce(...) / .process(...)

operator.print().setParallelism(1);
env.execute();
```

Splitting and merging streams

```java
import java.util.Collections;

import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.io.PojoCsvInputFormat;
import org.apache.flink.api.java.tuple.Tuple3;
import org.apache.flink.api.java.typeutils.PojoTypeInfo;
import org.apache.flink.core.fs.Path;
import org.apache.flink.streaming.api.datastream.ConnectedStreams;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.datastream.SplitStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.co.CoMapFunction;

String path = "E:\\GIT\\flink-learn\\flink-learn\\telemetering.txt";
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

PojoTypeInfo<TelemeterDTO> typeInfo = (PojoTypeInfo<TelemeterDTO>) Types.POJO(TelemeterDTO.class);
PojoCsvInputFormat<TelemeterDTO> inputFormat = new PojoCsvInputFormat<>(
        new Path(path), typeInfo, new String[]{"code", "value", "timestamp"});
DataStreamSource<TelemeterDTO> dataStreamSource = env.createInput(inputFormat, typeInfo);

// split (note: split/select was deprecated and later removed; side outputs replace it)
SplitStream<TelemeterDTO> splitStream = dataStreamSource.split(item -> {
    if (item.getValue() > 100) {
        return Collections.singletonList("high");
    }
    return Collections.singletonList("low");
});
DataStream<TelemeterDTO> highStream = splitStream.select("high");
DataStream<TelemeterDTO> lowStream = splitStream.select("low");

// connect: merges two streams whose element types may differ
ConnectedStreams<TelemeterDTO, TelemeterDTO> connectedStreams = lowStream.connect(highStream);
// union requires both streams to have the same element type:
// DataStream<TelemeterDTO> unionDataStream = lowStream.union(highStream);

SingleOutputStreamOperator<Tuple3<String, Double, Long>> operator = connectedStreams
        .map(new CoMapFunction<TelemeterDTO, TelemeterDTO, Tuple3<String, Double, Long>>() {
            @Override
            public Tuple3<String, Double, Long> map1(TelemeterDTO value) {
                return Tuple3.of(value.getCode(), value.getValue(), value.getTimestamp());
            }

            @Override
            public Tuple3<String, Double, Long> map2(TelemeterDTO value) {
                return Tuple3.of(value.getCode(), value.getValue(), value.getTimestamp());
            }
        });
operator.print();
env.execute();
```

UDF functions, which provide the underlying support
MapFunction
FilterFunction
ReduceFunction
ProcessFunction
SourceFunction
SinkFunction
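Each of these is a single-abstract-method interface, so it can be implemented with a lambda or a named class. As a rough illustration of the contract only, the following plain-Java sketch defines simplified stand-ins (these are NOT Flink's real classes, which live in org.apache.flink.api.common.functions) and chains a filter, a map, and a reduce the way a keyed pipeline would:

```java
import java.util.List;
import java.util.Optional;

public class UdfSketch {
    // Simplified stand-ins for Flink's UDF interfaces (illustrative only).
    public interface MapFunction<T, O> { O map(T value); }
    public interface FilterFunction<T> { boolean filter(T value); }
    public interface ReduceFunction<T> { T reduce(T a, T b); }

    // Applies filter -> map -> reduce over a finite batch, mimicking a keyed
    // stream's rolling aggregation in miniature. Empty result if nothing passes.
    public static <T, O> Optional<O> pipeline(List<T> input,
                                              FilterFunction<T> f,
                                              MapFunction<T, O> m,
                                              ReduceFunction<O> r) {
        Optional<O> acc = Optional.empty();
        for (T v : input) {
            if (!f.filter(v)) {
                continue;
            }
            O mapped = m.map(v);
            acc = Optional.of(acc.isEmpty() ? mapped : r.reduce(acc.get(), mapped));
        }
        return acc;
    }

    public static void main(String[] args) {
        // Keep positive readings, square them, sum the squares: 3*3 + 4*4 = 25.
        Optional<Integer> result = pipeline(
                List.of(3, -1, 4, -5),
                (Integer v) -> v > 0,   // FilterFunction
                (Integer v) -> v * v,   // MapFunction
                Integer::sum);          // ReduceFunction
        System.out.println(result.get()); // 25
    }
}
```

The shape of the real interfaces is the same; Flink simply invokes them record by record inside its operators.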
Rich functions
Rich functions additionally carry a lifecycle and access to runtime context information, for example:
open() runs once when the operator instance is created and can, for instance, establish a database connection.
close() runs once before the operator's life ends and releases such resources.
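To make the lifecycle concrete, here is a small plain-Java sketch. It is a simplified stand-in, not Flink's actual RichMapFunction API: open() is invoked once before any record is processed, close() once after the last record, which is exactly where real Flink code would acquire and release a database connection.

```java
import java.util.ArrayList;
import java.util.List;

public class RichFunctionSketch {
    // Simplified stand-in for the rich-function lifecycle (illustrative only).
    public static abstract class MiniRichMapFunction<I, O> {
        public void open() {}               // e.g. acquire a database connection
        public abstract O map(I value);
        public void close() {}              // e.g. release the connection
    }

    // Drives a function through the lifecycle over a finite batch of records.
    public static <I, O> List<O> run(MiniRichMapFunction<I, O> fn, List<I> input) {
        fn.open();                          // once, before any record
        List<O> out = new ArrayList<>();
        for (I v : input) {
            out.add(fn.map(v));             // once per record
        }
        fn.close();                         // once, after the last record
        return out;
    }

    public static void main(String[] args) {
        List<String> lifecycle = new ArrayList<>();
        MiniRichMapFunction<Integer, Integer> doubler =
                new MiniRichMapFunction<Integer, Integer>() {
                    @Override public void open()  { lifecycle.add("open"); }
                    @Override public Integer map(Integer v) { return v * 2; }
                    @Override public void close() { lifecycle.add("close"); }
                };
        System.out.println(run(doubler, List.of(1, 2, 3))); // [2, 4, 6]
        System.out.println(lifecycle);                      // [open, close]
    }
}
```

In real Flink code the same pattern appears by extending RichMapFunction and overriding open() and close(); Flink calls them around the record-processing loop for you.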
The above is the full content of "How to use Transform in Flink". Thank you for reading! I hope the content shared here has been helpful; if you want to learn more, you are welcome to follow the industry information channel.