How to use SQL to migrate data tables to data warehouse

2025-02-24 Update From: SLTechnology News&Howtos


Shulou (Shulou.com) 05/31 Report

This article walks through how to use SQL to migrate data tables into a data warehouse, following a real scenario step by step.

Demand background

Recently, the company decided to consolidate the data of several major business systems, centralizing the data from each system into the data warehouse. There are more than 5,000 data tables in total, but fortunately the volume of business data is nowhere near e-commerce scale: only a few dozen gigabytes.

Demand analysis

The requirement itself is simple: gather these 5,000-plus tables from different databases into one place. The following points need attention:

1. The data comes from different types of databases: SQL Server, MySQL, and Oracle.

2. There are a large number of tables, so writing the extraction query for each one by hand is clearly not realistic.

3. The data warehouse will need ongoing maintenance afterward.

Proposed solution

Since the data volume is not very large, I plan to use DBLINK (linked servers) to pull data from the different source databases into the data warehouse.

Solution walkthrough

1. Create different DBLINK

Our data warehouse currently runs on a SQL Server instance, and its overall performance is acceptable. However, the business systems use different database types, so the DBLINKs must be created differently:

A. For SQL Server business databases, the linked server can be created directly on the warehouse server.

B. For MySQL and Oracle business databases, the linked server must be configured through ODBC middleware.
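As a sketch of points A and B, a linked server can be created with the system procedure sp_addlinkedserver. The server names, DSN, and login below are placeholders, and the MySQL variant assumes an ODBC system DSN has already been set up on the warehouse machine:

```sql
-- Point A: linked server to another SQL Server instance (name is a placeholder)
EXEC sp_addlinkedserver
    @server = N'192.168.0.39';

-- Point B: linked server to MySQL through ODBC,
-- assuming a system DSN named 'MySQL_DSN' already exists
EXEC sp_addlinkedserver
    @server     = N'MYSQL_LINK',   -- placeholder name used in later queries
    @srvproduct = N'MySQL',
    @provider   = N'MSDASQL',      -- Microsoft OLE DB provider for ODBC
    @datasrc    = N'MySQL_DSN';

-- Map local logins to remote credentials (placeholder user/password)
EXEC sp_addlinkedsrvlogin
    @rmtsrvname  = N'MYSQL_LINK',
    @useself     = 'false',
    @rmtuser     = N'etl_user',
    @rmtpassword = N'etl_password';
```

An Oracle linked server follows the same ODBC pattern with its own DSN.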

2. Query all table names in the database

Each business database is extracted in full, so we first need to enumerate all the tables in each database. Taking SQL Server as an example:

SELECT NAME FROM SYSOBJECTS WHERE TYPE='U'

The above query returns the names of all user tables in the current database; you can try it yourself.

For Oracle, the current user's tables can be listed as follows:

SELECT * FROM USER_TABLES

For MySQL, the table names of a given schema can be listed as follows:

SELECT table_name FROM information_schema.tables WHERE table_schema = 'db_name'
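Once the linked servers exist, the MySQL and Oracle metadata queries can also be run from the warehouse side with OPENQUERY, which forwards a pass-through query to the remote server. The linked server names below (MYSQL_LINK, ORACLE_LINK) are placeholders from the earlier setup sketch:

```sql
-- Table names from a MySQL source via the linked server
SELECT * FROM OPENQUERY(MYSQL_LINK,
    'SELECT table_name FROM information_schema.tables WHERE table_schema = ''db_name''');

-- Table names from an Oracle source
SELECT * FROM OPENQUERY(ORACLE_LINK,
    'SELECT table_name FROM user_tables');
```

Note the doubled single quotes inside the pass-through string, since the whole remote query is itself a SQL string literal.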

3. Circular extraction of data

With the two steps above done, we can loop over each business system and extract its data. A cursor handles the loop; the code is as follows:

DECLARE @TableName varchar(50), @Sql varchar(500)  -- two variables: one for the table name, one for the dynamic SQL

-- Define a cursor holding every user table name on the source server
DECLARE cursor_variable CURSOR FOR
    SELECT name FROM [192.168.0.39].[test].[dbo].[sysobjects] WHERE xtype = 'u' ORDER BY name

OPEN cursor_variable                             -- open the cursor
FETCH NEXT FROM cursor_variable INTO @TableName  -- fetch the first table name
WHILE @@FETCH_STATUS = 0                         -- loop until every name is processed
BEGIN
    SET @Sql = 'SELECT * INTO dbo.' + @TableName
             + ' FROM [192.168.0.39].[test].[dbo].' + @TableName
    EXEC(@Sql)                                   -- parentheses are required to execute a string
    FETCH NEXT FROM cursor_variable INTO @TableName
END
CLOSE cursor_variable                            -- close the cursor
DEALLOCATE cursor_variable                       -- release the cursor

This code is only a first test; its performance can be optimized later.
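One optimization worth noting for daily runs: SELECT * INTO fails if the target table already exists. A sketch of a repeatable cursor body (using the same placeholder source server and the @TableName/@Sql variables from the cursor above) drops the previous copy first:

```sql
-- Inside the cursor loop: drop yesterday's copy before reloading
SET @Sql = 'IF OBJECT_ID(''dbo.' + @TableName + ''') IS NOT NULL DROP TABLE dbo.' + @TableName
         + '; SELECT * INTO dbo.' + @TableName
         + ' FROM [192.168.0.39].[test].[dbo].' + @TableName;
EXEC(@Sql);
```

A drop-and-reload like this is the simplest approach for a few dozen gigabytes; incremental loads would be needed at larger scale.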

4. Set scheduled tasks

Once the code is written, it would be impractical to run it manually every day. Instead we can use the database's scheduled tasks, which I covered in an earlier article, "Database task automation is actually very simple: a brief introduction to JOB".

Let's put the code into a scheduled task and have it run at 1:00 a.m. every day.
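A minimal sketch of such a job using SQL Server Agent system procedures. The job name, warehouse database name, and the stored procedure wrapping the cursor are all placeholders:

```sql
USE msdb;

EXEC sp_add_job @job_name = N'DW_Daily_Extract';      -- placeholder job name

EXEC sp_add_jobstep
    @job_name      = N'DW_Daily_Extract',
    @step_name     = N'Run extract cursor',
    @subsystem     = N'TSQL',
    @database_name = N'DW',                           -- placeholder warehouse database
    @command       = N'EXEC dbo.usp_ExtractAllTables;';  -- hypothetical proc wrapping the cursor

EXEC sp_add_jobschedule
    @job_name          = N'DW_Daily_Extract',
    @name              = N'Daily 1am',
    @freq_type         = 4,          -- 4 = daily
    @freq_interval     = 1,          -- every 1 day
    @active_start_time = 010000;     -- 01:00:00, HHMMSS format

EXEC sp_add_jobserver @job_name = N'DW_Daily_Extract';  -- bind job to the local server
```

The same schedule can also be created through the SQL Server Agent GUI in Management Studio.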

Summary

This method is feasible for small data volumes; with larger volumes it carries a real performance risk. To review what was done:

1. Create DBLINK for different databases

2. Query all table names of each database

3. Insert into the data warehouse using a cursor loop

4. Schedule the cursor script to run as a recurring job

Problems may arise at every step, but once each is solved, the migration is done.


That concludes this introduction to migrating data tables to the data warehouse with SQL. Thank you for reading.
