2025-01-21 Update From: SLTechnology News&Howtos
Shulou (Shulou.com) 06/03 Report --
This article walks through example code for reading HBase data from Spark, a task that raises questions for many people in day-to-day work. After consulting various materials, I have put together a simple, easy-to-follow example. I hope it answers your questions about reading HBase data from Spark. Let's get started.
package hgs.spark.hbase
// reference: https://blog.csdn.net/mlljava1111/article/details/52675901
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.hadoop.hbase.client.Scan
import org.apache.hadoop.hbase.filter.FilterList
import org.apache.hadoop.hbase.filter.FilterList.Operator
import org.apache.hadoop.hbase.filter.RowFilter
import org.apache.hadoop.hbase.filter.RegexStringComparator
import org.apache.hadoop.hbase.filter.CompareFilter.CompareOp
import org.apache.hadoop.hbase.protobuf.ProtobufUtil
import org.apache.hadoop.hbase.util.Base64
import org.apache.hadoop.hbase.util.Bytes
import org.apache.hadoop.hbase.filter.LongComparator

object HbaseToSpark {
  def main(args: Array[String]): Unit = {
    // Use Kryo serialization for better performance
    val conf = new SparkConf
    conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    conf.setMaster("local").setAppName("hbasedata")
    val context = new SparkContext(conf)

    // HBase configuration
    val hconf = new HBaseConfiguration
    hconf.set("hbase.zookeeper.quorum", "bigdata00:2181,bigdata01:2181,bigdata02:2181")
    hconf.set("hbase.zookeeper.property.clientPort", "2181")
    hconf.set(TableInputFormat.INPUT_TABLE, "test")

    // Start and stop rowkeys of the table scan
    val scan = new Scan
    scan.setStartRow("1991".getBytes)
    scan.setStopRow("3000".getBytes)
    // Alternatively, a FilterList can restrict the rowkey range:
    // val list = new FilterList(Operator.MUST_PASS_ALL)
    // val filter1 = new RowFilter(CompareOp.GREATER_OR_EQUAL, new LongComparator(1991))
    // val filter2 = new RowFilter(CompareOp.LESS_OR_EQUAL, new RegexStringComparator("3000*"))
    // list.addFilter(filter1)
    // list.addFilter(filter2)
    // scan.setFilter(list)

    // Attach the serialized scan to the configuration
    hconf.set(TableInputFormat.SCAN, convertScanToString(scan))

    val hrdd = context.newAPIHadoopRDD(
      hconf,
      classOf[TableInputFormat],
      classOf[org.apache.hadoop.hbase.io.ImmutableBytesWritable],
      classOf[org.apache.hadoop.hbase.client.Result])

    val resultrdd = hrdd.repartition(2)
    // Print the results
    resultrdd.foreach { case (_, value) =>
      val key = Bytes.toString(value.getRow)
      val name = Bytes.toString(value.getValue("cf1".getBytes, "name".getBytes))
      val age = Bytes.toString(value.getValue("cf1".getBytes, "age".getBytes))
      println("rowkey:" + key + " name:" + name + " age:" + age)
    }
    context.stop()
  }

  // Serialize a Scan to the Base64 string that TableInputFormat expects
  def convertScanToString(scan: Scan) = {
    val proto = ProtobufUtil.toScan(scan)
    Base64.encodeBytes(proto.toByteArray)
  }
}

This concludes the study of the Spark-reads-HBase example code. I hope it has resolved your doubts. Pairing theory with practice is the best way to learn, so go and try it! If you want to keep learning more related knowledge, please continue to follow this site, where we will keep working to bring you more practical articles.
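For the example to print anything, the scanned table must already exist. The table name "test", column family "cf1", columns "name"/"age", and the rowkey range 1991-3000 all come from the code above; the specific sample rows and values below are made up for illustration. A minimal setup sketch in the HBase shell:

```
# Hypothetical sample data matching the schema assumed by the example.
hbase shell <<'EOF'
create 'test', 'cf1'
put 'test', '1995', 'cf1:name', 'alice'
put 'test', '1995', 'cf1:age', '25'
put 'test', '2000', 'cf1:name', 'bob'
put 'test', '2000', 'cf1:age', '30'
EOF
```

With these rows in place, the Spark job should print one "rowkey: ... name: ... age: ..." line per row whose key falls between the scan's start and stop keys.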