Pipelining 10g Database Export and Import

Lucas Jellema

A very interesting feature of Oracle 10g Data Pump that I recently became aware of is the option to link an Export job in one database to an Import job in another (or, presumably, the same) database through a pipe. This allows for direct data transfer without the creation of an intermediate dmp file. It also allows the Import to start processing as soon as the Export has started, without having to wait for the Export to complete. I just stumbled across an Oracle Magazine article that discusses this feature in much more detail: Tuning: Speed and Simplify Data Movement.

Export and Import Over the Network

Typically, Oracle Data Pump Export results in a binary format dumpfile that is then loaded (using Oracle Data Pump Import) into another database. However, in addition to this file-based Oracle Data Pump Export and Import, Oracle Data Pump provides a network-based approach for direct loading to or unloading from one database to another.
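For reference, the file-based approach takes two steps, with the dumpfile written by the first and read by the second. A minimal sketch; the connect strings, the scott schema, and the dpump_dir directory object are placeholders that would need to exist in your environment:

    expdp scott/tiger@source_db SCHEMAS=scott DIRECTORY=dpump_dir DUMPFILE=scott.dmp LOGFILE=scott_exp.log
    impdp scott/tiger@target_db SCHEMAS=scott DIRECTORY=dpump_dir DUMPFILE=scott.dmp LOGFILE=scott_imp.log

The import cannot start until the export has finished writing scott.dmp, which is exactly the wait that the network-based approach eliminates.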

Most DBAs are probably familiar with using the legacy Oracle Export and Import utilities over a network, using named pipes as the mechanism through which to pass the data. One of the problems with that approach can be that named pipes aren’t supported on every platform.
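For comparison, that legacy named-pipe trick looks roughly like this on a Unix-like platform (connect strings and the pipe name are placeholders):

    mkfifo /tmp/exp_pipe
    # export writes into the pipe in the background ...
    exp scott/tiger@source_db owner=scott file=/tmp/exp_pipe &
    # ... while import reads from it as the data arrives
    imp scott/tiger@target_db fromuser=scott touser=scott file=/tmp/exp_pipe

Besides the platform dependency, this couples the two legacy utilities through the operating system rather than through the database.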

Instead of using named pipes, Oracle Data Pump uses database links for exporting and importing data between source and target over the network. And, as with the named pipe mechanism, the network_link feature entirely bypasses dumpfile creation, transmitting data from one database to another across the network.
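Concretely, you create a database link in the target database that points at the source, and then run impdp against the target with the network_link parameter. Note that there is no DUMPFILE parameter at all, because no dumpfile is ever written. All names here (source_link, SOURCE_TNS, the scott schema, dpump_dir for the log file) are illustrative:

    -- in the target database, e.g. via SQL*Plus:
    CREATE DATABASE LINK source_link
      CONNECT TO scott IDENTIFIED BY tiger
      USING 'SOURCE_TNS';

    impdp system/manager@target_db NETWORK_LINK=source_link SCHEMAS=scott DIRECTORY=dpump_dir LOGFILE=net_import.log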

This feature should be of keen interest to any DBAs spending lots of time doing production-to-test refreshes for development or extracting data from a read-only data warehouse. (For a read-only database, you must use Oracle Data Pump Export’s network_link mechanism, initiating the process on the target database, because Oracle Data Pump won’t be able to create the necessary master table on the read-only database.)
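In other words, you point expdp at a read/write database and let it pull the data across the link from the read-only source; the master table then lives in the read/write database, and a regular dumpfile is produced there. Again a sketch with placeholder names:

    expdp system/manager@rw_db NETWORK_LINK=ro_source_link SCHEMAS=scott DIRECTORY=dpump_dir DUMPFILE=ro_scott.dmp LOGFILE=ro_scott_exp.log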

Because network bandwidth is the limiting factor, the network_link capability is best used in conjunction with the filtering capability, as a means of siphoning off a subset of data (rather than an entire multigigabyte or -terabyte database) from one database to another.
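For example, the TABLES and QUERY filters can be combined with network_link to pull across only a recent slice of one table. The table and predicate below are made up, and a parfile avoids shell-quoting headaches around the QUERY clause:

    impdp system/manager@target_db parfile=sales_subset.par

    # sales_subset.par:
    NETWORK_LINK=source_link
    TABLES=sh.sales
    QUERY=sh.sales:"WHERE time_id > SYSDATE - 31"
    DIRECTORY=dpump_dir
    LOGFILE=sales_subset.log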


3 thoughts on “Pipelining 10g Database Export and Import”

  1. A lot of times I need to quickly transfer data from my VLDB Oracle tables into flat CSV files for various reasons. Even in 10g there is no efficient way to do this. We ended up using FastReader to do this job: http://www.wisdomforce.com

  2. In pre-Oracle 10g releases, we could also use a named pipe to export/import data:
    mknod web.dat p
    exp userid=user/pass@host1 buffer=21971520 file=web.dat .... &
    imp userid=user/pass@host1 buffer=21971520 file=web.dat ....

