Topic: Restoring a database with Log Explorer. Poster: ITBOY3000 (BitBoy), credit value: 100, forum: MS-SQL Server basics, points: 100, replies: 26, posted: 2004-04-12 13:56:05
Original post address: http://expert.9cbs.net/expert/topic/2931/2931668.xml. Because of a forum restriction (no more than 30 replies can be accepted in one thread), I started this new one. First of all, thanks to Lynx1111, Leeboyan (baby), and Realgz for their enthusiastic help, and thanks also to islandNet (e bunny), zhijiao (I am a pig's younger brother), ShuiICHANGLIU (life will get better and better!), Ghostzxp (ghost), Terencegan (New Beijing, New Olympics), Progress99 (like treading on thin ice), Outwindows, Benimarunikado (Peng Jianjun) ... for everyone's enthusiasm. The database that was overwritten by mistake has now been almost completely recovered, and the website is up and running again. You are welcome to visit [address: http://www.gbq.cn]
Here I will report the recovery process and the problems I ran into along the way, in the hope that it helps anyone who meets the same situation:
[Cause:] While using the DTS Import/Export Wizard to transfer a local database to the server, I left the "Select All Objects" option checked by mistake, so 140 tables with over 10 million rows on the remote server were overwritten (the operator went off to eat right after submitting the job, so it was never cancelled midway). The database had no backup.
[Recovery process:] The tool used was Log Explorer. (Download address: http://five.ttdown.com/l/log explorer for sql Serverv v3.21.kg.exe)
Open Log Explorer: File => Attach Log File -> select the server and log file -> Connect -> select the database -> Attach -> View Log, and you can see the log records. Click "View DDL Commands" and you will see many DROP TABLE commands. Click the "Undo" button below to generate the table-structure statements (CREATE TABLE ...); click the "Salvage" button below to generate the insert statements (INSERT INTO ... VALUES ...). (The steps above were provided by Lynx1111.)
Following the method above, I used "Salvage" to generate the INSERT statements for the overwritten tables; in fact, the SQL script generated this way already contains the CREATE TABLE statements. Generation took about 8 hours, which felt slow at the time, but compared with the recovery process that came later, it was not slow at all. The script for the largest table exceeded 1 GB.
After generating all the SQL scripts, I stopped SQL Server as a precaution and copied the log (.ldf) and .mdf files out of the Data folder (I was afraid of damaging the log file, and since the database had never been backed up, this served as a file-level backup). The files totalled 5.7 GB.
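As an aside, if stopping the whole service is not an option, SQL Server 2000 also lets you detach a database, copy its files, and re-attach them. A minimal sketch, assuming a database named gbq and a default data path (both names and paths below are placeholders, not from the original post):

```sql
-- Detach the database so its .mdf/.ldf files can be copied safely
EXEC sp_detach_db @dbname = 'gbq'

-- ... copy gbq.mdf and gbq_log.ldf elsewhere using the OS ...

-- Re-attach the original files to bring the database back online
EXEC sp_attach_db @dbname    = 'gbq',
                  @filename1 = 'C:\MSSQL\Data\gbq.mdf',
                  @filename2 = 'C:\MSSQL\Data\gbq_log.ldf'
```

Either way, the point is the same as in the post: before doing anything else, secure a physical copy of the data and log files.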
Then the formal recovery work began. I created a new database and tried the script of a small table in SQL Query Analyzer, and there was no problem. But I then found that Query Analyzer reported errors on the large SQL script files. After asking Realgz, I learned that Log Explorer itself supports large scripts well, so I switched to its Run SQL Script function to execute them. Sure enough, the large files could be restored too. But once the run started, tables containing ntext fields turned out to recover very slowly; opening a recovery script for such a table showed that it uses WRITETEXT to write the data. Recovering one table of 300,000 rows took nearly 12 hours, and the database has a large number of such tables. To speed things up, I installed Log Explorer on several machines and had them all join the recovery. Finally, after 3 days, almost all the tables were done, though the recovery process produced a small number of errors.
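For reference, recovery scripts for text/ntext columns typically insert the row first and then write the long value separately through a text pointer, which is why they run so much slower than plain INSERTs. A sketch of that SQL Server 2000 pattern (the table and column names are made up for illustration; this is not Log Explorer's exact output):

```sql
-- Insert the row with an empty placeholder for the ntext column
INSERT INTO t_article (id, title, body) VALUES (1, N'hello', N'')

-- Fetch a text pointer for that row's ntext column,
-- then write the long value through it
DECLARE @ptr varbinary(16)
SELECT @ptr = TEXTPTR(body) FROM t_article WHERE id = 1
WRITETEXT t_article.body @ptr N'...very long ntext content...'
```

Every ntext row costs an extra round of pointer lookup and WRITETEXT, which matches the roughly 12-hours-per-300,000-rows speed described above.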
Next I imported the tables restored on the different machines into one database. But the restored tables do not include indexes, identity columns, and so on, so the indexes, identity columns, defaults, and triggers all had to be re-created. When creating the primary keys, duplicate rows turned up . . . there was no way around it but to delete the duplicate data.
SELECT DISTINCT * INTO t_new FROM t_old can remove duplicate rows, but this method cannot be used on tables with ntext fields. In the end I deleted the records with duplicated keys directly: DELETE FROM t_table WHERE id IN (SELECT a.id FROM t_table a WHERE (SELECT COUNT(*) FROM t_table b WHERE b.id = a.id) > 1). Note that this removes every copy of a duplicated id, not just the extras.
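If you need to keep one copy of each duplicated row instead of deleting them all, a common SQL Server 2000 trick is to add a temporary IDENTITY column as a surrogate key. It never compares the ntext column itself, so it works where SELECT DISTINCT does not. A sketch, with t_table and id as placeholder names:

```sql
-- Add a surrogate key so duplicate rows become distinguishable
ALTER TABLE t_table ADD rowno int IDENTITY(1,1)
GO

-- Keep the first copy of each id, delete the rest
DELETE FROM t_table
WHERE rowno NOT IN (SELECT MIN(rowno) FROM t_table GROUP BY id)
GO

-- Remove the helper column once the data is clean
ALTER TABLE t_table DROP COLUMN rowno
```

After this, creating the primary key on id should succeed without losing whole records.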
After 72 hours of effort, 99.9% of the data was recovered, and the website went back online on April 8.
At that point some users reported that they could not log in. I found that a small part of the data really was lost, namely the rows Log Explorer had errored on. With no other option, I opened the generated SQL script again and searched for those rows. On close inspection it turned out that some of them contain a large number of carriage returns, which Log Explorer's script runner cannot handle, hence the errors.
Well, the customer is God. There was nothing for it, so I recovered the user table locally: whenever an error occurred I recorded the ID, then pasted that part of the SQL script into Query Analyzer and ran it there (Query Analyzer can run it).
A maintenance plan has now been set up to make a complete backup every week. The procedures for operating on the database have also been standardized, to prevent this kind of incident from happening again.
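Beyond the maintenance-plan wizard, the same weekly full backup can also be scripted directly and scheduled through SQL Server Agent. A sketch, with the database name and backup path as placeholders:

```sql
-- Weekly full backup; WITH INIT overwrites the previous backup set
BACKUP DATABASE gbq
TO DISK = 'D:\backup\gbq_full.bak'
WITH INIT, NAME = 'gbq weekly full backup'

-- With the full recovery model, periodic log backups in between
-- would additionally allow point-in-time restore:
BACKUP LOG gbq TO DISK = 'D:\backup\gbq_log.bak'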
[Some lessons learned:]
1. Use text/ntext fields with care.
2. Log Explorer's script-execution tool handles large files very well, but it will generate errors during execution on rows containing multiple carriage returns.
3. Don't panic; go to 9CBS and ask the experts for help. They will help you very enthusiastically.