Oracle Export and Import Introduction
1. The Oracle Export/Import utilities move data between databases: EXPORT writes data from a database into a dump file, and IMPORT loads it back. Typical uses are: (1) transferring data between two Oracle Servers, whether the same version, different versions, or different operating systems; (2) backing up and recovering a database; (3) moving data from one schema to another, or from one tablespace to another.
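As a concrete sketch, a round trip with the command-line tools might look like this. All usernames, passwords, connect strings, and file names are illustrative, and a reachable Oracle instance is assumed:

```shell
# Export the SCOTT schema from the source database into a dump file
exp scott/tiger@source_db file=scott.dmp log=exp_scott.log

# Import the entire contents of that dump file into the target database
imp scott/tiger@target_db file=scott.dmp log=imp_scott.log full=y
```

Note that `full=y` on `imp` means "import everything in the file", not a full-database export.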
2. The dump file Export produces is in a binary format and must not be edited by hand; doing so corrupts the data. The format is identical on every platform Oracle supports, so dump files are portable across platforms.
When importing, dump files are upward compatible: for example, an Oracle7 dump file can be imported into Oracle8. Moving between versions can still cause problems, however.
3. The Export/Import process. A dump file produced by Export contains two basic kinds of content: DDL and data. The file holds all the DDL statements needed to recreate the exported objects, in a mostly readable format. Even so, do not edit it with a text editor; Oracle does not support that.
The Oracle objects included in the dump file depend on the export mode: Table, User, or Full. Some objects are exported only in Full mode (for example public synonyms, users, rollback segments, and so on):
Table Mode            User Mode             Full Database Mode
------------------    ------------------    ----------------------
Table constraints     Table constraints     Table constraints
Table triggers        Table triggers        All triggers
                      Clusters              Clusters
                      Database links        Database links
                      Job queues            Job queues
                      Refresh groups        Refresh groups
                      Sequences             Sequences
                      Snapshots             Snapshots
                      Snapshot logs         Snapshot logs
                      Stored procedures     Stored procedures
                      Private synonyms      All synonyms
                      Views                 Views
                                            Profiles
                                            Replication catalog
                                            Resource costs
                                            Roles
                                            Rollback segments
                                            System audit options
                                            System privileges
                                            Tablespace definitions
                                            Tablespace quotas
                                            User definitions
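The three modes correspond directly to `exp` parameters. A sketch, with hypothetical object and file names:

```shell
# Table mode: only the named tables and their closely related objects
exp scott/tiger tables=emp,dept file=tables.dmp

# User mode: everything owned by the given schema
exp system/manager owner=scott file=scott.dmp

# Full database mode: the entire database (requires the EXP_FULL_DATABASE role)
exp system/manager full=y file=full.dmp
```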
4. When importing, Oracle inserts the data in a specific order. The order may change between database versions, but it is currently:

 1. Tablespaces
 2. Profiles
 3. Users
 4. Roles
 5. System privilege grants
 6. Role grants
 7. Default roles
 8. Tablespace quotas
 9. Resource costs
10. Rollback segments
11. Database links
12. Sequences
13. Snapshots
14. Snapshot logs
15. Job queues
16. Refresh groups
17. Cluster definitions
18. Tables (with their grants, comments, indexes, constraints, and auditing)
19. Referential integrity
20. POSTTABLES actions
21. Synonyms
22. Views
23. Stored procedures
24. Triggers, defaults, and auditing
This order exists mainly to resolve dependencies between objects. Triggers are imported last so they do not fire while rows are being inserted. After an import, some procedures may be left in an INVALID state: the import changes objects they depend on, and IMPORT does not recompile them. Recompiling the affected procedures solves the problem.
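A sketch of cleaning up INVALID procedures after an import. This must run as a privileged user against a live instance; the schema and procedure names are illustrative:

```shell
sqlplus -s system/manager <<'EOF'
-- List the procedures the import left INVALID
SELECT owner, object_name
  FROM dba_objects
 WHERE status = 'INVALID' AND object_type = 'PROCEDURE';

-- Recompile one by hand ...
ALTER PROCEDURE scott.some_proc COMPILE;

-- ... or recompile all invalid objects with Oracle's utility script
@?/rdbms/admin/utlrp.sql
EOF
```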
5. Compatibility. The Import tool can handle dump files produced by Export version 5.1.22 and later, so Oracle7's Import can process an Oracle6 dump file, and so on. The larger the gap between versions, though, the more likely problems become. Consult the relevant documentation for specifics, such as the COMPATIBLE parameter and related settings.
6. The views Export needs are created by catexp.sql; Export uses them to organize the data format of the dump file. Most of the views gather what is needed to build the CREATE DDL statements; the rest are used mainly by Oracle developers.
These views can differ between Oracle versions, since each release may add new features. An old dump file can therefore produce errors under a new version; running catexp.sql again generally resolves this. The general steps for working around the backward-compatibility problem are as follows:
When the exporting database is older than the target database:
- Run the old catexp.sql in the target database that the data will be imported into
- Export the dump file with the old Export utility
- Import it into the database with the new Import utility
- Run the new catexp.sql in the database to restore that version's export views
When the exporting database is newer than the target database:
- Run the new catexp.sql in the target database that the data will be imported into
- Export the dump file with the new Export utility
- Import it into the database with the new Import utility
- Run the old catexp.sql in the database to restore that version's export views
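Each catexp.sql step above amounts to running the script as a privileged user in the relevant database, along these lines (the password is a placeholder, and "?" expands to ORACLE_HOME inside SQL*Plus; run the copy of catexp.sql shipped with whichever Export version you need):

```shell
sqlplus -s "sys/change_on_install as sysdba" <<'EOF'
@?/rdbms/admin/catexp.sql
EOF
```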
7. Defragmentation with Export/Import. One very important application of Export/Import is defragmenting tables: Import re-creates the table and then inserts the data, so the whole table ends up stored contiguously. A related point: by default, EXPORT "compresses" the table when generating the dump file, but this compression is widely misunderstood. In fact, COMPRESS only changes the INITIAL value among the table's storage parameters. For example, take a table created with CREATE TABLE ... STORAGE (INITIAL 10K NEXT 10K ...) whose data has since grown to 100 extents. If it is exported with COMPRESS=Y, the generated DDL contains STORAGE (INITIAL 1000K NEXT 10K).
Notice that NEXT is unchanged, while INITIAL has become the sum of all the extents. This leads to situations like the following: table A has four 100 MB extents; after DELETE FROM A, exporting with COMPRESS=Y produces a CREATE TABLE statement with a 400 MB initial extent, even though the table contains no data! So even though the dump file is small, the import creates a huge table.
Worse, the computed INITIAL can exceed the size of the data files. For example, suppose there are four 50 MB data files and table A has fifteen 10 MB extents spread across them; with COMPRESS=Y the export produces INITIAL=150M. The import then fails to allocate a 150 MB extent, because a single extent cannot span multiple data files.
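One way to avoid the oversized INITIAL extent is to export with COMPRESS=N, so the table keeps its original storage parameters. A sketch, with illustrative credentials and names:

```shell
# COMPRESS=N preserves the table's original INITIAL/NEXT storage values
exp scott/tiger tables=A compress=n file=a.dmp log=a.log
```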
8. Transferring data between users and tablespaces. Exported data is normally restored to where it came from. If a table belongs to the SCOTT user and was exported in Table or User mode, the import will fail if the SCOTT user does not exist in the target database! Data exported in Full mode includes the CREATE USER statements, so the users are created automatically to receive the data. Alternatively, the FROMUSER and TOUSER parameters of IMPORT let you choose which user the data is imported into, but the TOUSER user must already exist.
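A sketch of redirecting a dump into a different schema with FROMUSER/TOUSER, here via a parameter file. The target user must already exist, and all names are illustrative:

```shell
# Write an imp parameter file (a plain-text config fragment)
cat > imp_scott_to_test.par <<'EOF'
file=scott.dmp
log=imp_scott_to_test.log
fromuser=scott
touser=test
EOF

# Import SCOTT's objects into the TEST schema
imp system/manager parfile=imp_scott_to_test.par
```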
9. The effect of Export/Import on sequences. Export/Import affects sequences in two cases. (1) If users are fetching values from a sequence while the export runs, the exported sequence can end up inconsistent with the exported table data. (2) If the sequence uses CACHE, the values in the cache are ignored during export; only the current value recorded in the data dictionary is exported.
If a sequence is being used to update a column while the export runs, and neither of the two cases above applies, then the exported data is the data from before the update.
If the conventional path is used, each row is loaded with an INSERT statement, so consistency checks are performed and INSERT triggers fire. If direct mode is used, some constraints and triggers may not fire; and if a trigger uses sequence.nextval, the sequence is affected accordingly.
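The two paths are chosen with Export's DIRECT parameter; a sketch, with illustrative names (whether constraints and triggers are evaluated differs as described above):

```shell
# Conventional path (the default): rows pass through the SQL layer
exp scott/tiger tables=emp file=emp_conv.dmp

# Direct path: bypasses much of the SQL evaluation layer
exp scott/tiger tables=emp file=emp_direct.dmp direct=y
```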