
Great news! Run Scala and PySpark programs without a Hadoop environment

2025-01-15 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/03 Report--

Databricks officially describes its advantages as follows:

Cloud platforms like Databricks provide an integrated, hosted solution that removes the three obstacles enterprises face in adopting Spark and making big data projects succeed. We provide you with a fully managed and tuned Spark cluster, backed by the team of experts who develop Spark. Our platform gives you an interactive workspace for exploration, visualization, collaboration, and distribution. When you are ready for production, you can launch a job with one click of the mouse, and we will build the infrastructure automatically.

In addition, we provide a rich set of APIs for programmatic access to the platform, which also lets users seamlessly integrate third-party applications.
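As a sketch of that programmatic access: the Databricks REST API can be called over HTTPS with a personal access token. The workspace URL and token below are placeholders, and this is a minimal illustration using only the standard library, not an official client.

```python
import json
import urllib.request

# Placeholder values: substitute your own workspace URL and access token.
WORKSPACE_URL = "https://community.cloud.databricks.com"
TOKEN = "dapiXXXXXXXXXXXXXXXX"  # hypothetical token, not real

def list_clusters(base_url: str, token: str) -> dict:
    """Call the Databricks clusters/list REST endpoint and return parsed JSON."""
    req = urllib.request.Request(
        f"{base_url}/api/2.0/clusters/list",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Example usage (requires a valid token, so it is left commented out):
# clusters = list_clusters(WORKSPACE_URL, TOKEN)
# for c in clusters.get("clusters", []):
#     print(c["cluster_name"], c["state"])
```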

Ordinary users like us who want to practice Scala and PySpark run into pain points: an underpowered computer, no Hadoop environment installed, and no environment at work to borrow.

It doesn't matter: you can apply for a free Databricks workspace, and you don't need * big * to run Scala code.

The only catch is that signing up for an account requires a Gmail address.

The registration address is as follows:

https://community.cloud.databricks.com/login.html

Select Community Edition.

After entering, fill in your name and a company name (anything, e.g. abc, will do).

Note: the registration verification code here may require * big *, ha ~

If you ask the author about this, I can only reply like this:

Once the account is set up, create a Scala or Python (PySpark) script by choosing Create Notebook:

You can create notebooks in four languages, Scala, Python, SQL, and R, to run Spark programs.

As shown in the picture, the free cluster (6 GB of memory, Spark 2.4, Scala 2.11) is a very sweet deal, letting us easily experience Spark in the cloud. It is highly recommended for users with laptops or low-spec computers.

The notebooks you create there can also be exported and imported, so if friends run into syntax problems (all kinds of difficult and complicated diseases), they can export the notebook and consult the author samir and the group leader in the WeChat group.

Here are two simple Scala statements to test:

1. A common if-else control statement

2. Scala calling Linux commands such as wget (to download files), ls, and pwd

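The article's test cells are in Scala; an equivalent sketch in Python (which also runs in a Databricks notebook) covers both tests, using the standard library's subprocess module in place of Scala's sys.process. Since any wget URL would be a placeholder, only ls and pwd are actually run here.

```python
import subprocess

# 1. A plain if-else control statement.
x = 10
if x > 5:
    label = "big"
else:
    label = "small"
print(label)  # prints "big"

# 2. Calling Linux commands such as ls and pwd from the notebook.
# (wget would work the same way: subprocess.run(["wget", url]) with a real URL.)
pwd = subprocess.run(["pwd"], capture_output=True, text=True).stdout.strip()
listing = subprocess.run(["ls"], capture_output=True, text=True).stdout
print(pwd)
print(listing)
```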

That's it for the hands-on part, and the end of this share.

Attached is the official operation manual:

https://docs.databricks.com/getting-started/index.html

And here is our Scala & Spark WeChat group QR code:

Welcome to communicate ~
