2025-04-01 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)06/03 Report--
1. Install the dependency packages
yum install rsync gcc openldap-devel python-ldap mysql-devel python-devel python-setuptools python-simplejson sqlite-devel libxml2-devel libxslt-devel cyrus-sasl-devel openldap openldap24-libs openldap-clients openssl-devel
easy_install ipdb ipython nose coverage nosetty threadframe werkzeug windmill
2. Download the CDH build of Hue
wget http://archive-primary.cloudera.com/cdh5/cdh/5/hue-3.7.0-cdh5.4.2.tar.gz
3. Unpack and install
tar zxvf hue-3.7.0-cdh5.4.2.tar.gz
cd hue-3.7.0-cdh5.4.2
make install PREFIX=/hue HADOOP_HOME=/home/hadoop/hadoop
ln -s /hue/hue/desktop/libs/hadoop/java-lib/hue-plugins-3.7.0-cdh5.4.2.jar /home/hadoop/hadoop/lib
4. Create the Hue MySQL database and user
mysql -u root -p
create database hue;
grant all on hue.* to 'hue'@'localhost' identified by 'Ab1234567890';
grant all on hue.* to 'hue'@'hd1' identified by 'Ab1234567890';
grant all on hue.* to 'hue'@'%' identified by 'Ab1234567890';
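For Hue to store its metadata in this MySQL database rather than the default SQLite file, the `[[database]]` block under `[desktop]` in hue.ini has to point at it before the syncdb/migrate step runs. A minimal sketch, assuming the host, database name, and credentials from the grants above:

```ini
[desktop]
  [[database]]
    engine=mysql
    host=hd1
    port=3306
    user=hue
    password=Ab1234567890
    name=hue
```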
5. Synchronize hue initialization data
/hue/hue/build/env/bin/hue migrate
/hue/hue/build/env/bin/hue syncdb
6. Edit the configuration file
vim /hue/hue/desktop/conf/hue.ini
[desktop]
secret_key=XXXX11112222
http_host=hd1        # an IP address also works
http_port=7777
time_zone=Asia/Shanghai
server_user=hadoop
server_group=hadoop
default_user=hadoop
default_hdfs_superuser=hadoop
default_site_encoding=utf-8
[hadoop]
  [[hdfs_clusters]]
    [[[default]]]
      fs_defaultfs=hdfs://hd1:9000
      webhdfs_url=http://hd1:14000/webhdfs/v1
      hadoop_conf_dir=/home/hadoop/hadoop/etc/hadoop
[beeswax]
hive_server_host=hd1
hive_server_port=10000
hive_conf_dir=/usr/local/spark/spark-1.3.0-bin-hadoop2.3/conf
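As a quick sanity check that the settings above actually made it into the file, you can grep for the keys Hue reads at startup. This is only a sketch: so that it runs anywhere, it writes a stand-in copy of the file; on a real install, set INI=/hue/hue/desktop/conf/hue.ini instead.

```shell
# Sanity-check that the key settings made it into hue.ini.
# Stand-in copy for illustration; use the real file on a live install.
INI=$(mktemp)
cat > "$INI" <<'EOF'
[desktop]
http_host=hd1
http_port=7777
[beeswax]
hive_server_host=hd1
hive_server_port=10000
EOF

RESULT=ok
for key in http_host http_port hive_server_host hive_server_port; do
    # Each key must appear at the start of a line, e.g. "http_port=7777".
    grep -q "^${key}=" "$INI" || { echo "missing: $key"; RESULT=bad; }
done
echo "check: $RESULT"
rm -f "$INI"
```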
Features you do not use can simply be left unconfigured.
Beeswax is Hue's Hive app. To run queries through Spark SQL instead, first make sure Spark's Thrift server is up; Hue then sends the HiveQL it receives through Beeswax straight to the Spark SQL Thrift server.
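Putting the pieces together, starting the stack comes down to bringing up the Spark Thrift server and then Hue's supervisor process. The paths below are the ones used earlier in this guide (adjust them to your layout); the existence checks make the sketch safe to run even where those paths differ:

```shell
# Start Spark's Thrift server, then Hue.
# SPARK_HOME/HUE_HOME match the paths used earlier in this guide.
SPARK_HOME=/usr/local/spark/spark-1.3.0-bin-hadoop2.3
HUE_HOME=/hue/hue

if [ -x "$SPARK_HOME/sbin/start-thriftserver.sh" ]; then
    "$SPARK_HOME/sbin/start-thriftserver.sh"
else
    echo "skip: start-thriftserver.sh not found under $SPARK_HOME"
fi

if [ -x "$HUE_HOME/build/env/bin/supervisor" ]; then
    "$HUE_HOME/build/env/bin/supervisor" &   # serves on http_port (7777 above)
else
    echo "skip: Hue supervisor not found under $HUE_HOME"
fi
```

Once both are up, Hue answers on http://hd1:7777 and Beeswax queries flow to port 10000 as configured above.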