The Fastest Way to Import Massive Data into SQL Server

zhaozj  2021-02-12  149

Recently, a project required the database to take in massive amounts of data for analysis: 2 million rows had to be imported into SQL Server. Written as ordinary INSERT statements, I am afraid the job could not be done inside an hour. I first considered BCP, but it is command-line based and very unfriendly to users, so it was not really an option; in the end I settled on the BULK INSERT statement. BULK INSERT also handles large data volumes, it can be driven from a program so the interface can be made very friendly, and it is extremely fast: it imports 1 million rows in under 20 seconds, and in terms of speed I am afraid nothing beats it.

This approach does have a few shortcomings: 1. it monopolizes the table that receives the data; 2. it generates a large amount of log. These can be overcome, and if you are willing to sacrifice a little speed you can also exercise more precise control, even over the insertion of each row (see the sketches after the example below). Where the log would otherwise occupy a great deal of space, we can switch the database to the bulk-logged recovery model before the import, so that the bulk operation is only minimally logged, and restore the database's original recovery model once the import is finished. Specifically, the statements can be written as:

    ALTER DATABASE taxi SET RECOVERY BULK_LOGGED

    BULK INSERT taxi..detail FROM 'E:\out.txt'
    WITH (
        DATAFILETYPE = 'char',
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '\n',
        TABLOCK
    )

    ALTER DATABASE taxi SET RECOVERY FULL

The BULK INSERT statement here imports the data file E:\out.txt into the detail table of database taxi.
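For comparison, the BCP route rejected above is a single command line. A rough sketch (the server name myserver is a placeholder of mine, not from the article):

    bcp taxi..detail in E:\out.txt -c -t, -r\n -S myserver -T

Here -c loads the file as character data, -t and -r set the field and row terminators, and -T uses a trusted connection. Every option must be typed at the command line, which is exactly why the user experience is poor.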
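The "more precise control" mentioned above comes from BULK INSERT's other WITH options, traded against some speed. A minimal sketch; the option values are illustrative, not from the article:

    BULK INSERT taxi..detail FROM 'E:\out.txt'
    WITH (
        DATAFILETYPE = 'char',
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '\n',
        FIRSTROW = 2,        -- skip a header line if the file has one
        BATCHSIZE = 50000,   -- commit every 50,000 rows; a failure rolls back only the current batch
        MAXERRORS = 10       -- tolerate up to 10 bad rows before aborting
    )

Pushing BATCHSIZE all the way down to 1 gives per-row commits, the row-by-row control mentioned above, at a considerable cost in speed.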
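Restoring the database's original recovery model assumes you know what it was. If not, it can be read beforehand with DATABASEPROPERTYEX and put back dynamically; a sketch (the variable name @model is mine):

    DECLARE @model sysname
    SET @model = CAST(DATABASEPROPERTYEX('taxi', 'Recovery') AS sysname)  -- FULL, BULK_LOGGED, or SIMPLE

    ALTER DATABASE taxi SET RECOVERY BULK_LOGGED

    BULK INSERT taxi..detail FROM 'E:\out.txt'
    WITH (DATAFILETYPE = 'char', FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK)

    -- ALTER DATABASE does not accept a variable here, so the restore goes through dynamic SQL
    EXEC('ALTER DATABASE taxi SET RECOVERY ' + @model)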
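As for "can be programmed": BULK INSERT will not accept a variable as the file name, so a friendly programmatic interface usually means a small wrapper that splices the path into dynamic SQL. A sketch under that assumption; the procedure name usp_ImportDetail and its parameter are hypothetical, not from the article:

    CREATE PROCEDURE dbo.usp_ImportDetail    -- hypothetical wrapper, callable from any front end
        @datafile nvarchar(260)              -- full path of the text file to load
    AS
    BEGIN
        DECLARE @sql nvarchar(1000)
        -- the file name cannot be a variable in BULK INSERT itself, so build the statement as a string
        SET @sql = N'BULK INSERT taxi..detail FROM ''' + @datafile + N'''
                     WITH (DATAFILETYPE = ''char'', FIELDTERMINATOR = '','',
                           ROWTERMINATOR = ''\n'', TABLOCK)'

        ALTER DATABASE taxi SET RECOVERY BULK_LOGGED   -- minimal logging for the duration of the load
        EXEC (@sql)
        ALTER DATABASE taxi SET RECOVERY FULL          -- put the recovery model back
    END

A front end then only needs EXEC dbo.usp_ImportDetail @datafile = 'E:\out.txt', which gives the kind of friendly interface described above.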

Please credit the original source when reprinting: https://www.9cbs.com/read-6747.html
