
The detailed process of developing Spark with Eclipse

2025-02-24 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/01 Report--

This article explains the detailed process of developing Spark with Eclipse. The method introduced here is simple, fast, and practical; let's walk through it step by step.

First, set up the environment

Install the Scala IDE plug-in in Eclipse.

Second, read from Elasticsearch and MySQL

First add pom:

<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>test</groupId>
  <artifactId>test</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <name>spark</name>

  <properties>
    <scala.version>2.11.8</scala.version>
    <spark.version>2.2.0</spark.version>
    <spark.artifactId.version>2.11</spark.artifactId.version>
    <!-- "18.0" appeared here in the original; its property name was lost in extraction -->
  </properties>

  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>3.8.1</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_${spark.artifactId.version}</artifactId>
      <version>${spark.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_${spark.artifactId.version}</artifactId>
      <version>${spark.version}</version>
    </dependency>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-compiler</artifactId>
      <version>${scala.version}</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>com.alibaba</groupId>
      <artifactId>fastjson</artifactId>
      <version>1.2.29</version>
    </dependency>
    <dependency>
      <groupId>org.elasticsearch</groupId>
      <artifactId>elasticsearch-spark-20_${spark.artifactId.version}</artifactId>
      <version>6.2.0</version>
      <scope>compile</scope>
      <exclusions>
        <exclusion>
          <groupId>org.slf4j</groupId>
          <artifactId>log4j-over-slf4j</artifactId>
        </exclusion>
      </exclusions>
    </dependency>
    <dependency>
      <groupId>mysql</groupId>
      <artifactId>mysql-connector-java</artifactId>
      <version>5.1.6</version>
    </dependency>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>${scala.version}</version>
    </dependency>
    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-api</artifactId>
      <version>1.6.4</version>
    </dependency>
    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
      <version>1.7.25</version>
    </dependency>
  </dependencies>

  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.6.1</version>
        <configuration>
          <source>1.8</source>
          <target>1.8</target>
        </configuration>
      </plugin>
      <plugin>
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
        <version>3.2.2</version>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-jar-plugin</artifactId>
        <version>3.0.2</version>
        <configuration>
          <archive>
            <manifest>
              <addClasspath>true</addClasspath>
              <classpathPrefix>lib/</classpathPrefix>
              <mainClass>spark.example.Main</mainClass>
            </manifest>
          </archive>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-dependency-plugin</artifactId>
        <version>3.0.0</version>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>copy-dependencies</goal>
            </goals>
            <configuration>
              <outputDirectory>${project.build.directory}/lib</outputDirectory>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>

Then write the main function:

package test

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.Row
import org.apache.spark.sql.Dataset
import java.util.Properties

object querySql {
  def main(args: Array[String]): Unit = {
    // Read MySQL data:
    val spark = SparkSession.builder()
      .appName("Java Spark MYSQL basic example")
      .master("local")
      .config("es.nodes", "127.0.0.1")
      .config("es.port", "9200")
      .config("es.mapping.date.rich", "false") // do not parse date types
      .getOrCreate()

    val url = "jdbc:mysql://127.0.0.1:3306/test?useUnicode=true&characterEncoding=utf8"
    val table = "sys_user"
    val props = new Properties()
    props.setProperty("dbtable", table)     // set table
    props.setProperty("user", "root")       // set user name
    props.setProperty("password", "123456") // set password

    // val df = spark.read.jdbc(url, table, props)
    // df.show()
    // Add filter criteria:
    // val filter = df.filter(col("TABLE_ID").gt("10"))
    // System.out.println("mysql count: " + filter.count())

    val esRows = spark.read.format("org.elasticsearch.spark.sql").load("visitlog/_doc")
    // esRows.show()
    esRows.createOrReplaceGlobalTempView("table1")

    // val subDf = spark.sql("SELECT userId,ip,createTime,createTime2 FROM global_temp.table1")
    val subDf = spark.sql("SELECT userId,count(userId) FROM global_temp.table1 group by userId")
    subDf.show()
    spark.close()
  }
}
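To check the group-by logic without a running Elasticsearch or MySQL instance, the same SQL can be tried against a small in-memory DataFrame. This is a sketch, assuming the Spark 2.2.x dependencies from the pom above; the object name and sample userId values are made up for illustration:

```scala
package test

import org.apache.spark.sql.SparkSession

object LocalGroupByCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("local group-by check")
      .master("local")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical rows standing in for the ES "visitlog/_doc" data
    val rows = Seq(("u1", "10.0.0.1"), ("u1", "10.0.0.2"), ("u2", "10.0.0.3"))
      .toDF("userId", "ip")
    rows.createOrReplaceGlobalTempView("table1")

    // Same query shape as in querySql above; expect u1 -> 2, u2 -> 1
    val counts = spark.sql(
      "SELECT userId, count(userId) FROM global_temp.table1 GROUP BY userId")
    counts.show()
    spark.close()
  }
}
```

Note that a global temp view lives in the `global_temp` database, which is why the query has to qualify the table name as `global_temp.table1`.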

Third, package and execute

Packaging command: mvn clean scala:compile package

Execute the command: java -Djava.ext.dirs=lib -cp test-0.0.1-SNAPSHOT.jar test.querySql

At this point, you should have a deeper understanding of the detailed process of developing Spark with Eclipse. Try it out in practice; for more related content, follow us and keep learning!
