

Hands-on: Building Spark Machine Learning Models with KNIME, Part 1: Setting Up the Development Environment




1. KNIME Analytics Platform installation

Download the appropriate version from the official website: https://www.knime.com/downloads

Extract the downloaded package to the installation path (see the installation notes at https://www.knime.com/installation-0).

The figure below shows the welcome page after KNIME starts.

To interact with the Spark cluster, you need to install the KNIME Extension for Apache Spark in KNIME, and install Spark Job Server on an edge node of the Hadoop cluster (or on any node that can run spark-submit). The architecture diagram is as follows:

2. KNIME Extension for Apache Spark installation

In KNIME Analytics Platform, click File > Install KNIME Extensions..., select KNIME Big Data Extensions, and click Next to install.

3. Spark Job Server installation

The following steps use CentOS 6.5 + CDH 5.7 as an example.

3.1 Download Spark Job Server

$ wget http://download.knime.org/store/3.5/spark-job-server-0.6.2.3-KNIME_cdh-5.7.tar.gz

3.2 Log in as root (or su root)

3.3 Installation

# LINKNAME=spark-job-server

# useradd -d /opt/${LINKNAME}/ -M -r -s /bin/false spark-job-server

# su -l -c "hdfs dfs -mkdir -p /user/spark-job-server; hdfs dfs -chown -R spark-job-server /user/spark-job-server" hdfs

# cp spark-job-server-0.6.2.3-KNIME_cdh-5.7.tar.gz /opt

# cd /opt

# tar -xvf spark-job-server-0.6.2.3-KNIME_cdh-5.7.tar.gz

# ln -s spark-job-server-0.6.2.3-KNIME_cdh-5.7 ${LINKNAME}

# chown -R spark-job-server:spark-job-server ${LINKNAME} spark-job-server-0.6.2.3-KNIME_cdh-5.7

3.4 Enable start at boot

# ln -s /opt/${LINKNAME}/spark-job-server-init.d /etc/init.d/${LINKNAME}

# chkconfig --levels 2345 ${LINKNAME} on
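You can then confirm the runlevel registration with a quick check:

# chkconfig --list ${LINKNAME}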

3.5 Edit environment.conf

Set master, for example:

master = "spark://ifrebdplatform1:7077"

Set the default settings for Spark contexts in the context-settings block. A combined sketch is shown below.
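Putting the two settings together, a minimal environment.conf sketch might look like the following. In the stock spark-jobserver configuration these keys live inside the spark { } block; the num-cpu-cores and memory-per-node values here are illustrative assumptions, not required values:

spark {
  # Spark master URL of this example cluster
  master = "spark://ifrebdplatform1:7077"

  # Default settings applied to newly created Spark contexts
  context-settings {
    num-cpu-cores = 2        # assumed example value
    memory-per-node = 512m   # assumed example value
  }
}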

3.6 Edit settings.sh

Set SPARK_HOME. In this example the default is already correct and is left unchanged.

Set LOG_DIR if you do not want to use the default log directory.
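For illustration, the relevant lines in settings.sh could look like this; the SPARK_HOME path assumes a CDH parcel installation, and the log directory is an arbitrary choice:

# settings.sh (excerpt) - illustrative values
SPARK_HOME=/opt/cloudera/parcels/CDH/lib/spark
LOG_DIR=/var/log/spark-job-server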

3.7 Edit log4j-server.properties as needed
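For example, you can adjust the verbosity of the Job Server's own logging with standard log4j 1.x syntax; the level shown is just an example:

# log4j-server.properties (excerpt) - illustrative
log4j.logger.spark.jobserver=INFO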

3.8 Start Spark Job Server

# /etc/init.d/${LINKNAME} start
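By default Spark Job Server listens on port 8090 (configurable in environment.conf). A quick way to confirm it is up is to query its REST API; GET /contexts returns the list of running Spark contexts, which is empty right after a fresh start:

# curl http://localhost:8090/contexts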

3.9 Add a Create Spark Context node in KNIME to test the connection

Right-click the Create Spark Context node and click Execute to run it.

Right-click the node again and click Spark Context to view the result.
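If the context was created successfully, it should also appear in the Job Server's REST API; the hostname below is the example edge node from this walkthrough:

$ curl http://ifrebdplatform1:8090/contexts
(the response is a JSON list that should now include the context created by KNIME)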

To be continued.
