Visual Studio Code Adds Support for SQL Server 2019 Big Data Cluster PySpark Development and Query

Shulou (Shulou.com) 06/02 Report --

Recently, Microsoft announced that Visual Studio Code supports PySpark development and queries against SQL Server 2019 big data clusters. The tooling complements Azure Data Studio: after data scientists finish data exploration and experimentation, data engineers can use it to write and productionize PySpark jobs. The Apache Spark and Hive extension for Visual Studio Code offers a cross-platform, lightweight Python editing experience covering Python authoring, debugging, Jupyter Notebook integration, and notebook-like interactive queries.

With the Visual Studio Code extension, you get a native Python programming experience, including linting, debugging support, and language services. You can run the current line, selected lines, or the whole .py file; import and export .ipynb notebooks; and execute notebook-style queries with "Run Cell", "Run Above", or "Run Below". You get the same interactive experience as a notebook, with your source code and markdown comments shown alongside run results and output, and in the interactive results window you can delete unwanted parts, enter comments, or type additional code. You can also visualize results graphically through matplotlib, just as in a Jupyter Notebook. Integration with SQL Server 2019 big data clusters lets you quickly submit PySpark batch jobs to the cluster and monitor job progress.
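As an illustration, a script like the following can be split into cells with "# %%" markers and run with "Run Cell", with the matplotlib figure rendered in the interactive results window. This is a minimal sketch; the app name, sample data, and column names are hypothetical, not part of Microsoft's announcement.

# %% Build a small DataFrame in a PySpark session
from pyspark.sql import SparkSession
spark = SparkSession.builder.appName("interactive-demo").getOrCreate()
df = spark.createDataFrame(
    [("2019-06", 120), ("2019-07", 180), ("2019-08", 150)],
    ["month", "orders"],
)
df.show()

# %% Visualize the result with matplotlib, as in a Jupyter Notebook
import matplotlib.pyplot as plt
pdf = df.toPandas()  # collect the small result to the driver for plotting
plt.bar(pdf["month"], pdf["orders"])
plt.xlabel("month")
plt.ylabel("orders")
plt.title("Orders per month")
plt.show()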

Main feature highlights

1. Connect to SQL Server. The toolkit lets you connect to a SQL Server 2019 big data cluster and submit PySpark jobs to it.

2. Python editing. Develop PySpark applications with native Python authoring support, such as IntelliSense, automatic formatting, error checking, and so on.

3. Jupyter Notebook integration. Import and export .ipynb files.

4. PySpark interactive. Run selected lines of code or notebook cells such as PySpark cells interactively, with interactive visualization of the results.

5. PySpark batch. Submit PySpark applications to the SQL Server 2019 big data cluster (a minimal batch-job sketch follows this list).

6. PySpark monitoring. Integrates with the Apache Spark History Server to view job history and to debug and diagnose Spark jobs.
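To make item 5 concrete, here is a minimal sketch of the kind of standalone PySpark script that could be submitted as a batch job. The HDFS paths, column names, and app name are hypothetical placeholders, not part of Microsoft's announcement.

# batch_job.py - minimal PySpark batch job sketch (paths and columns are hypothetical)
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-by-region").getOrCreate()

# Read a CSV already stored in the cluster's HDFS (placeholder path)
orders = spark.read.csv("/data/orders.csv", header=True, inferSchema=True)

# Aggregate order amounts per region and write the result back as Parquet
(orders.groupBy("region")
       .agg(F.sum("amount").alias("total_amount"))
       .write.mode("overwrite")
       .parquet("/data/orders_by_region"))

spark.stop()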

How to install or update

First, install Visual Studio Code, and then download Mono 4.2.x for Linux or macOS. Next, open the Visual Studio Code extension repository or the Visual Studio Code Marketplace and search for "Spark" to get the latest Apache Spark and Hive tools.
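Once the tools are installed and linked to a cluster, a short smoke test like the following (a hypothetical check, not an official step in the announcement) is a quick way to confirm that a PySpark session starts and can run a trivial distributed job:

# Smoke test: confirm a Spark session starts and report its version
from pyspark.sql import SparkSession
spark = SparkSession.builder.appName("smoke-test").getOrCreate()
print("Spark version:", spark.version)
print("Row count:", spark.range(1000).count())  # counts 0..999 as a trivial job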
