SLTechnology News & Howtos > Database (updated 2025-04-01)
Shulou (Shulou.com), 05/31 report
This article explains in detail how to quickly import a large CSV file into SQL Server. The editor shares it as a reference; I hope you will come away with a solid understanding of the topic after reading.
Preface
This came from answering a question on a forum: import CSV data, adding a new column date datetime to the imported rows, with the requirement that the import of 2,000,000 rows finish within 10 seconds. Here I share the ideas and methods used to solve the problem.
Analysis
Generally speaking, BULK INSERT is a little faster than BCP, so we choose BULK INSERT. The proposed approach is to import the data into a staging table in SQL Server, then insert it into the target table while adding the date column. The statements are as follows:
BULK INSERT test07232 FROM 'D:\2017-7-22.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

SELECT *, GETDATE() AS Date INTO ttt FROM test07232;
However, when the CSV he provided was imported, the following errors were raised:
Msg 4866, Level 16, State 1, Line 1
The bulk load failed. The column is too long in the data file for row 1, column 2. Verify that the field terminator and row terminator are specified correctly.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
This is because the row terminator was not recognized. Open the CSV file in Notepad++ and enable View > Show Symbol > Show End of Line.
You can see that the file's line ending is LF.
A normal CSV uses CRLF as the line ending by default. Moreover, when ROWTERMINATOR is specified as '\n', SQL Server implicitly treats it as the CRLF pair, so an LF-only file is never split into rows.
Therefore, the BULK INSERT statement above does not execute properly.
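Instead of opening the file in an editor, the line endings can also be checked programmatically by inspecting the file's raw bytes. A minimal Python sketch (the function name and the path it would be given are my own, not from the article):

```python
# Sketch: report which row terminator a CSV file actually uses,
# by sampling the first chunk of raw bytes.
def detect_row_terminator(path):
    with open(path, "rb") as f:
        chunk = f.read(65536)   # first 64 KiB is enough to sample
    if b"\r\n" in chunk:
        return "CRLF"           # BULK INSERT: ROWTERMINATOR = '\n'
    if b"\n" in chunk:
        return "LF"             # BULK INSERT: ROWTERMINATOR = '0x0a'
    return "unknown"
```

A file that reports LF is exactly the case described here, and needs the hex-coded terminator shown in the solution below.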
Solution
1. The first idea was to fix the data at the source so that it produces normal CRLF-terminated files, but the source is difficult to change.
2. Write a C# program to preprocess the file, but that is too time-consuming.
3. Finally, the right way was found: specify the LF terminator as a hex code.
BULK INSERT test07232 FROM 'D:\2017-7-22.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '0x0a');

SELECT *, GETDATE() AS Date INTO ttt FROM test07232;
In the end, with everything on SSD, the import took 2 s and producing the final table took 1 s. The whole process completed in 3 seconds.
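As an aside on option 1 above: if the import statement could not be changed, the file itself could be rewritten to CRLF before loading. A hedged Python sketch (lf_to_crlf is a hypothetical helper, not part of the article's solution; for a 2-million-row file this adds a full extra read and write of the file, which is why the ROWTERMINATOR fix is preferable):

```python
# Sketch: rewrite a file's LF line endings as CRLF.
# Reads and writes in binary so the data bytes are never re-encoded.
def lf_to_crlf(src, dst):
    with open(src, "rb") as fin, open(dst, "wb") as fout:
        for line in fin:                      # binary iteration splits on b"\n"
            if line.endswith(b"\n"):
                # strip any existing \r first, so CRLF input stays CRLF
                fout.write(line[:-1].rstrip(b"\r") + b"\r\n")
            else:
                fout.write(line)              # last line without a newline
```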
That is the solution for quickly importing a large CSV into SQL Server. I hope the content above is helpful to you. If you found the article useful, share it for more people to see.