2025-01-16 Update From: SLTechnology News & Howtos > Internet Technology
Shulou (Shulou.com) 06/02 report:
Environment:
Maven: 3.3.9
JDK: java version "1.8.0_51"
Spark: spark-1.6.1.tgz
Scala: 2.11.7
Spark is built against Scala 2.10.5 by default. To build against Scala 2.11.x, first run:
./dev/change-scala-version.sh 2.11
The compilation command is as follows:
mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver -Dscala-2.11 -DskipTests clean package
The -Phive and -Phive-thriftserver profiles pull in the dependencies spark-sql needs to talk to Hive, and -Dscala-2.11 selects the Scala version.
Note: the hive-site.xml file must be placed in the $SPARK_HOME/conf directory, otherwise Spark cannot find the Hive tables.
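For reference, a minimal hive-site.xml sketch, assuming a MySQL-backed Hive metastore (the host, database name, and credentials below are placeholders, not values from this setup):

```xml
<configuration>
  <!-- JDBC connection to the (assumed) MySQL metastore; adjust host, database, and credentials -->
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hive</value>
  </property>
</configuration>
```

The MySQL connector JAR referenced later must also be on Spark's classpath for these settings to work.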
Using spark-sql to access Hive
package com.infra.codelab.spark.hive

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

object HiveTest {
  def main(args: Array[String]): Unit = {
    // An application name is required, otherwise SparkContext creation fails
    val conf = new SparkConf().setAppName("HiveTest")
    val sc = new SparkContext(conf)
    val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
    sqlContext.sql("SELECT line FROM filecontent").collect().foreach(println)
  }
}
Submit the task:
spark-submit --class com.infra.codelab.spark.hive.HiveTest --master spark://localhost:7077 /home/xiaobin/test/spark/wordcount-0.0.1-SNAPSHOT.jar
spark-sql:
export SPARK_CLASSPATH=$SPARK_CLASSPATH:/home/xiaobin/soft/apache-hive-0.14.0-bin/lib/mysql-connector-java-5.1.35.jar
spark-sql --master spark://xiaobin:7077
spark-sql> select count(*) from filecontent;
483
Time taken: 3.628 seconds, Fetched 1 row(s)
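The queries in this article assume a Hive table named filecontent with a single string column called line. A hedged sketch of how such a table might have been created and loaded in Hive (the input file path is a placeholder, not from this setup):

```sql
-- Hypothetical DDL: one STRING column named line, matching SELECT line FROM filecontent
CREATE TABLE IF NOT EXISTS filecontent (line STRING);

-- Load a local text file into the table; the path is an assumption for illustration
LOAD DATA LOCAL INPATH '/tmp/sample.txt' INTO TABLE filecontent;
```

With such a table in place, the count above would simply be the number of lines in the loaded file.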