2025-04-07 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)06/03 Report--
1. Key-free login configuration
ssh-keygen
cd .ssh
touch authorized_keys
cat id_rsa.pub > authorized_keys
chmod 600 authorized_keys
2. Environment tools
2.1 Environment
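The effect of these commands can be tried out safely before touching a real host; a minimal sketch using a throwaway directory (the $DEMO directory is an illustrative stand-in for ~/.ssh):

```shell
# Demo of the key setup in a temporary directory (stand-in for ~/.ssh; adjust paths on a real host)
DEMO=$(mktemp -d)
ssh-keygen -t rsa -N '' -f "$DEMO/id_rsa" -q       # no passphrase, quiet
cat "$DEMO/id_rsa.pub" >> "$DEMO/authorized_keys"  # >> appends, preserving any existing keys
chmod 600 "$DEMO/authorized_keys"                  # sshd ignores the file if permissions are looser
ls -l "$DEMO"
```

On the real machine, `ssh localhost` should then log in without prompting for a password.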
System Ubuntu
jdk 1.7.0_79
scala 2.10.4
hadoop 2.6.0
spark 1.6.2
2.2 Packaging tools
IDEA + sbt 1.2
3. Package
3.1 Install plugins
Install the Scala plugin first: File -> Settings -> Plugins -> enter "scala" in the search box -> Install.
The IDE must be restarted after the installation completes.
3.2 Create a project
File -> New Project -> Scala -> SBT -> select the appropriate versions -> Finish
3.3 Write the code
Add the Spark dependency to build.sbt:
name := "demoPro"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.2"
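As an aside, sbt can append the Scala binary version suffix automatically with the %% operator; given scalaVersion := "2.10.4", the line below resolves to the same spark-core_2.10 artifact as above:

```scala
// %% appends the Scala binary version, so this resolves to spark-core_2.10
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.2"
```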
Create WordCount.scala and write the following code
import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("wordcount")
    val sc = new SparkContext(conf)
    // Read the input file, split each line into words
    val input = sc.textFile("/home/dell/helloSpark.txt")
    val lines = input.flatMap(line => line.split(" "))
    // Count occurrences of each word
    val count = lines.map(word => (word, 1)).reduceByKey { case (x, y) => x + y }
    count.saveAsTextFile("/home/dell/helloSparkRes")
    sc.stop()
  }
}
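The logic of the job can be sanity-checked locally without Spark; a rough coreutils equivalent (the file contents here are illustrative):

```shell
# Local stand-in for the Spark job: split on spaces, count occurrences per word
printf 'hello spark\nhello spark world\n' > /tmp/helloSpark.txt
tr -s ' ' '\n' < /tmp/helloSpark.txt | sort | uniq -c | sort -rn
```

Note that the Spark job writes its counts as (word,count) pairs into part files under /home/dell/helloSparkRes rather than printing them.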
File -> Project Structure -> Artifacts -> click + -> JAR -> From modules with dependencies (the second option) -> specify the Module and Main Class -> under "JAR files from libraries" select the second option -> click OK
Click Build -> Build Artifacts -> Build
The jar package is generated in the out directory under the project; packaging has succeeded.
4. Submit the task
4.1 Start Hadoop
#Enter sbin directory
cd $HADOOP_HOME/sbin
#Start hadoop cluster
start-all.sh
4.2 Upload test files to hdfs
hadoop fs -mkdir -p /test
hadoop fs -put test.txt /test/test.txt
4.3 Upload jar package
Upload the program jar with a file-transfer tool such as FileZilla or sftp, or with the rz -y command.
4.4 Submit the task
4.4.1 Start the Master
sudo ./sbin/start-master.sh
Visit localhost:8080 to obtain the master URL, e.g. spark://xxx:7077
4.4.2 Start Worker
sudo ./bin/spark-class org.apache.spark.deploy.worker.Worker spark://dell:7077
4.4.3 Submit a job
sudo ./bin/spark-submit --master spark://dell:7077 --class WordCount /home/dell/demopro.jar
5. Check whether the program ran correctly
5.1 Check that the output folder has been generated
5.2 Open the files inside it to check that the results are correct