

What is the fastest way to import massive data from SQL Server?

2025-04-18 Update From: SLTechnology News&Howtos


Shulou (Shulou.com) 05/31 Report --

What is the fastest way to import massive amounts of data into SQL Server? This article analyzes the problem in detail and presents a workable solution, in the hope of helping readers facing the same task find a simple, practical approach.

Recently, while doing database analysis for a project, I needed to import a large amount of data: up to 2 million rows into SQL Server at a time. With ordinary INSERT statements the job could not be finished within an hour. I first considered bcp, but it is a command-line tool and not friendly enough for end users. In the end I settled on the BULK INSERT statement. It handles very large imports, it can be invoked from application code (so the user interface can be made very friendly), and it is extremely fast: importing 1 million rows takes less than 20 seconds, which is hard to beat.

This method does have a few disadvantages:

1. It requires an exclusive lock on the target table.
2. It generates a large amount of transaction log.
3. The data file it reads must follow a fixed format.

Compared with the speed gain, these shortcomings are acceptable, and if you are willing to sacrifice a little speed you can get more precise control, even over each inserted row. To keep the log from consuming a lot of space, we can switch the database to the bulk-logged recovery model before the import, so that the bulk operation is only minimally logged, and then restore the original recovery model afterwards. The specific statements can be written as follows:

alter database taxi set RECOVERY BULK_LOGGED

BULK INSERT taxi..detail FROM 'e:\out.txt'
WITH (DATAFILETYPE = 'char', FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK)

alter database taxi set RECOVERY FULL
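Because BULK INSERT is strict about the file format, it helps to generate the data file with exactly the terminators named in the WITH clause. Below is a minimal Python sketch; the column layout (id, plate, fare) and the sample rows are made up for illustration and should be adjusted to the real schema of the target table:

```python
# Write a character-format data file matching the BULK INSERT options
# FIELDTERMINATOR = ',' and ROWTERMINATOR = '\n'.
# The columns (id, plate, fare) are hypothetical examples.

def write_bulk_file(path, rows):
    """Write rows (sequences of values) as comma-separated, newline-terminated text."""
    with open(path, "w", encoding="ascii", newline="") as f:
        for row in rows:
            # Every field is rendered as plain text because DATAFILETYPE = 'char'.
            f.write(",".join(str(v) for v in row) + "\n")

sample = [(1, "B-12345", 23.5), (2, "B-67890", 18.0)]
write_bulk_file("out.txt", sample)
```

Note that if a field value could itself contain a comma or a newline, you would need to choose different terminator characters and use the same ones in the WITH clause.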

These statements import the data file e:\out.txt into the detail table of the taxi database.
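The recovery-model switch and the import itself can also be driven from application code, which is how the "friendly interface" mentioned above would be built. A sketch in Python: the statement-building function is an illustration of the article's three statements, and the commented-out pyodbc connection details are an assumption, not a prescribed setup:

```python
# Build the three T-SQL statements from the article: switch the database to
# the bulk-logged recovery model, run BULK INSERT, then restore FULL recovery.
# Database, table, and file path follow the article's example (taxi, detail,
# e:\out.txt); any real connection details are up to the caller.

def build_bulk_import_batch(db, table, datafile):
    """Return the list of T-SQL statements to execute, in order."""
    return [
        f"ALTER DATABASE {db} SET RECOVERY BULK_LOGGED",
        (
            f"BULK INSERT {db}..{table} FROM '{datafile}' "
            "WITH (DATAFILETYPE = 'char', FIELDTERMINATOR = ',', "
            "ROWTERMINATOR = '\\n', TABLOCK)"
        ),
        f"ALTER DATABASE {db} SET RECOVERY FULL",
    ]

statements = build_bulk_import_batch("taxi", "detail", r"e:\out.txt")

# Executing them would look roughly like this (requires a reachable server
# and an ODBC driver; the connection string is a placeholder):
# import pyodbc
# conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
#                       "SERVER=...;Trusted_Connection=yes", autocommit=True)
# cur = conn.cursor()
# for stmt in statements:
#     cur.execute(stmt)
```

Running the statements one by one on a single autocommit connection keeps the recovery-model change and the import in the right order; remember that the file path in BULK INSERT is resolved on the server, not on the client machine.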

That is the answer to the question of the fastest way to import massive amounts of data into SQL Server. I hope the content above is of some help to you. If you still have questions, you can follow the industry information channel to learn more.


