We came across a use case where we needed to hand off a data dump to a new environment that was being created for the public domain. The database servers in use were Oracle 11g, and the Data Pump utility made the overall process simple. Oracle provides export and import utilities, known collectively as Data Pump, whose jobs can be stopped and restarted; the ability to detach from and then reattach to huge jobs lets DBAs multitask. The utility can also estimate how much space an export job would consume without actually performing the export.
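For example, to preview the size of a schema export without writing a dump file, Data Pump's ESTIMATE_ONLY parameter can be used. A minimal sketch, using the same dbuser/myschema/SID placeholders as the commands below (NOLOGFILE=y simply suppresses the log file):
$./expdp dbuser/password@SID schemas=myschema estimate_only=y nologfile=y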
Using the Export utility of Data Pump
To use Data Pump, a directory needs to be created on the server machine, and a directory object needs to be created in the database and mapped to that directory in the server's file system. The steps are as follows:
1) Create a directory in the server file system
$mkdir /usr/satyam/saty_data_dir
2) Create a directory object in the database mapped to the file system directory
- Log in to SQL*Plus as SYSDBA: $./sqlplus sys as sysdba
- SQL>create directory data_pump_dir as '/usr/satyam/saty_data_dir';
- SQL>grant read, write on directory data_pump_dir to dbuser;
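To confirm the mapping, you can query Oracle's standard DBA_DIRECTORIES view as a sanity check:
- SQL>select directory_name, directory_path from dba_directories;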
3) Run the export
$./expdp dbuser/password@SID schemas=myschema directory=data_pump_dir dumpfile=myfilename.dmp logfile=myfilename.log
myfilename.dmp will be created in /usr/satyam/saty_data_dir.
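As mentioned above, a running export can be detached from and later reattached to. Pressing Ctrl+C in the expdp session switches to the interactive Export> prompt (STOP_JOB pauses the job, EXIT_CLIENT leaves it running); to reattach, pass the job name via ATTACH. A sketch, assuming Oracle's default name for the first schema-mode export job:
$./expdp dbuser/password@SID attach=SYS_EXPORT_SCHEMA_01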
Using the Import utility of Data Pump
The following steps need to be performed on the second server:
1) Create a directory in the server file system
$mkdir /usr/satyam/saty_data_dir
2) Create a directory object in the database mapped to the file system directory
- Log in to SQL*Plus as SYSDBA: $./sqlplus sys as sysdba
- SQL>create directory data_pump_dir as '/usr/satyam/saty_data_dir';
- SQL>grant read, write on directory data_pump_dir to dbuser;
3) Copy myfilename.dmp from the first server to /usr/satyam/saty_data_dir on the second server, for example as shown below
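A sketch assuming SSH access between the machines, with server2 and oracle as a hypothetical hostname and OS user:
$scp /usr/satyam/saty_data_dir/myfilename.dmp oracle@server2:/usr/satyam/saty_data_dir/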
4) Import the data dump
$./impdp dbuser/password schemas=myschema directory=data_pump_dir dumpfile=myfilename.dmp logfile=myfilename.log
Now you can check that the tables and the other data have been moved to the new server.
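For a quick check, you could connect as dbuser on the new server and count the imported objects; a minimal example, assuming the import was run into the same schema:
SQL>select object_type, count(*) from user_objects group by object_type;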