2025-01-18 Update From: SLTechnology News&Howtos
Shulou (Shulou.com) 06/01 Report
-- hbase shell filter
-- create 'test1', 'lf', 'sf'
-- lf: column family of LONG values (binary values)
-- sf: column family of STRING values
-- rowkey: which user (userX) at which time (tsX)
-- column name: which action was taken, e.g. c1: click from homepage; c2: click from ad; s1: search from homepage; b1: buy
-- value: which product (skuXXX)

put 'test1', 'user1|ts1', 'sf:c1', 'sku1'
put 'test1', 'user1|ts2', 'sf:c1', 'sku188'
put 'test1', 'user1|ts3', 'sf:s1', 'sku123'
put 'test1', 'user2|ts4', 'sf:c1', 'sku2'
put 'test1', 'user2|ts5', 'sf:c2', 'sku288'
put 'test1', 'user2|ts6', 'sf:s1', 'sku222'

scan 'test1', FILTER => "ValueFilter(=, 'binary:sku188')"
scan 'test1', FILTER => "ValueFilter(=, 'substring:188')"
scan 'test1', FILTER => "ValueFilter(=, 'substring:88')"
scan 'test1', FILTER => "ColumnPrefixFilter('c2') AND ValueFilter(=, 'substring:88')"
scan 'test1', FILTER => "ColumnPrefixFilter('s') AND (ValueFilter(=, 'substring:123') OR ValueFilter(=, 'substring:222'))"
scan 'test1', FILTER => "FirstKeyOnlyFilter() AND ValueFilter(=, 'binary:sku188') AND KeyOnlyFilter()"
scan 'test1', FILTER => "PrefixFilter('user1')"
scan 'test1', {STARTROW => 'user1|ts2', FILTER => "PrefixFilter('user1')"}
scan 'test1', {STARTROW => 'user1|ts2', STOPROW => 'user2'}

import org.apache.hadoop.hbase.filter.CompareFilter
import org.apache.hadoop.hbase.filter.SubstringComparator
import org.apache.hadoop.hbase.filter.RowFilter
scan 'test1', {FILTER => RowFilter.new(CompareFilter::CompareOp.valueOf('EQUAL'), SubstringComparator.new('ts3'))}

import org.apache.hadoop.hbase.filter.RegexStringComparator
put 'test1', 'user2|err', 'sf:s1', 'sku999'
scan 'test1', {FILTER => RowFilter.new(CompareFilter::CompareOp.valueOf('EQUAL'), RegexStringComparator.new('^user\d+\|ts\d+$'))}

import org.apache.hadoop.hbase.filter.CompareFilter
import org.apache.hadoop.hbase.filter.SingleColumnValueFilter
import org.apache.hadoop.hbase.filter.SubstringComparator
import org.apache.hadoop.hbase.util.Bytes
scan 't1', {COLUMNS => 'family:qualifier', FILTER => SingleColumnValueFilter.new(Bytes.toBytes('family'), Bytes.toBytes('qualifier'), CompareFilter::CompareOp.valueOf('EQUAL'), SubstringComparator.new('somevalue'))}

put 'test1', 'user1|ts9', 'sf:b1', 'sku1'
scan 'test1', FILTER => "ColumnPrefixFilter('b1') AND ValueFilter(=, 'binary:sku1')"
scan 'test1', {COLUMNS => 'sf:b1', FILTER => SingleColumnValueFilter.new(Bytes.toBytes('sf'), Bytes.toBytes('b1'), CompareFilter::CompareOp.valueOf('EQUAL'), Bytes.toBytes('sku1'))}

-- binary values
org.apache.hadoop.hbase.util.Bytes.toString("Hello HBase".to_java_bytes)
org.apache.hadoop.hbase.util.Bytes.toString("\x48\x65\x6c\x6c\x6f\x20\x48\x42\x61\x73\x65".to_java_bytes)

-- user userX as rowkey, each of the user's devices (browser, app, pc) as a column name,
-- and the corresponding cookie_id as the value (a long integer)
put 'test1', 'user1', 'lf:browser1', "\x00\x02"
put 'test1', 'user1', 'lf:app1', "\x00\x00\x0F"
put 'test1', 'user1', 'lf:app2', "\x00\x00\x10"
put 'test1', 'user2', 'lf:app1', "\x00\x00\x11"
put 'test1', 'user2', 'lf:pc1', "\x00\x00\x12"
scan 'test1', {STOPROW => 'user2', FILTER => "(ColumnPrefixFilter('app') AND ValueFilter(>, 'binary:\x00\x00\x0F'))"}
scan 'test1', {LIMIT => 10, FILTER => "(ColumnPrefixFilter('app') AND ValueFilter(>, 'binary:\x00\x00\x0F'))"}

-- drop the cf family (two equivalent forms), then add it back
alter 'test1', NAME => 'cf', METHOD => 'delete'
alter 'test1', 'delete' => 'cf'
alter 'test1', NAME => 'cf'

-- user userX as rowkey, the timestamp of a visit as the column name,
-- and the id of the visited page as the value: page_id (an integer)
put 'test1', 'user1', 'cf:1399999999', "\x00\x00\x00\x09"
put 'test1', 'user1', 'cf:1400000000', "\x00\x00\x00\x08"
put 'test1', 'user1', 'cf:1400000001', "\x00\x00\x00\x07"
put 'test1', 'user1', 'cf:1400000002', "\x00\x00\x20\xFB"
put 'test1', 'user2', 'cf:1500000000', "\x00\x00\x00\x11"
put 'test1', 'user2', 'cf:1500000001', "\x00\x00\x20\xFC"

-- hive hbase mapping
CREATE EXTERNAL TABLE user_app_cookie_list (username STRING, app1_cookie_id BIGINT, app2_cookie_id BIGINT)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key, lf:app1#b, lf:app2#b")
TBLPROPERTIES ("hbase.table.name" = "test1");

select * from user_app_cookie_list;

-- hive hbase mapping: a whole column family with binary values
-- http://www.abcn.net/2013/11/hive-hbase-mapping-column-family-with-binary-value.html
CREATE EXTERNAL TABLE ts_string (username STRING, visits map<string, string>)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key, cf:#s:b")
TBLPROPERTIES ("hbase.table.name" = "test1");

CREATE EXTERNAL TABLE ts_int (username STRING, visits map<int, int>)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key, cf:#s:b")
TBLPROPERTIES ("hbase.table.name" = "test1");

CREATE EXTERNAL TABLE ts_int_long (username STRING, visits map<int, bigint>)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key, cf:#s:b")
TBLPROPERTIES ("hbase.table.name" = "test1");

select * from ts_int lateral view explode (visits) t as ts, page;

select username, ts, page_id from ts_int lateral view explode (visits) t as ts, page_id;

select username, pos, ts, page_id from ts_int lateral view posexplode (visits) t as pos, ts, page_id;

username pos ts page_id
user1 1 1399999999 9
user1 2 1400000000 8
user1 3 1400000001 7
user1 4 1400000002 8443
user2 1 1500000000 17
user2 2 1500000001 8444

select username, from_unixtime (ts), page_id from ts_int lateral view explode (visits) t as ts, page_id;
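The binary literals in the puts above are just big-endian bytes, which is why the Hive `#b` mapping and the `binary:` comparator work. A quick Python sketch of the byte layout (using `int.to_bytes`/`from_bytes` as a stand-in for HBase's `Bytes` utility; this is illustration, not the HBase API):

```python
# "Hello HBase" written as the hex escapes used with Bytes.toString above
hex_literal = b"\x48\x65\x6c\x6c\x6f\x20\x48\x42\x61\x73\x65"
print(hex_literal.decode("ascii"))  # Hello HBase

# page_id 8443 as a 4-byte big-endian int -- the "\x00\x00\x20\xFB" put above
assert (8443).to_bytes(4, "big") == b"\x00\x00\x20\xFB"
assert int.from_bytes(b"\x00\x00\x20\xFB", "big") == 8443

# ValueFilter(>, 'binary:...') compares raw bytes lexicographically, which for
# equal-length big-endian non-negative values matches numeric order:
assert b"\x00\x00\x11" > b"\x00\x00\x0F"  # 0x11 = 17 > 0x0F = 15
```

Because the comparison is byte-by-byte, mixing value widths within one column family would break the numeric ordering, which is why the examples keep each family's values the same length.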
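To make the `lateral view explode` / `posexplode` reshaping concrete, here is a toy Python model of the same sample rows. The `ts_int` dict is a stand-in for the HBase-backed table; this is illustration, not Hive:

```python
# Toy model of ts_int: username -> visits map (timestamp -> page_id),
# mirroring the sample puts into the cf family.
ts_int = {
    "user1": {1399999999: 9, 1400000000: 8, 1400000001: 7, 1400000002: 8443},
    "user2": {1500000000: 17, 1500000001: 8444},
}

# lateral view explode(visits) t as ts, page_id:
# one output row per map entry, with username repeated alongside it
rows = [(username, ts, page_id)
        for username, visits in ts_int.items()
        for ts, page_id in visits.items()]

# lateral view posexplode(visits) t as pos, ts, page_id:
# same rows plus a 1-based position within each user's map
pos_rows = [(username, pos, ts, page_id)
            for username, visits in ts_int.items()
            for pos, (ts, page_id) in enumerate(visits.items(), start=1)]

for row in pos_rows:
    print(*row)
```

Run on this data, `pos_rows` reproduces the six-row result table shown above, e.g. `('user1', 4, 1400000002, 8443)`.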