As in most projects, a recently started one needed a representative test data set. The data was available in a small MS Access database (7 tables). In this case a two-step conversion was chosen. The first step was to use Oracle Migration Workbench (Release 10.1.0.2.2) to transform the MS Access tables into Oracle tables and to copy the data. The Workbench first generates an XML file that contains the source table structure. With this information the Oracle tables were created in the database. The column mapping was adjusted to prevent any data conversion problems or data loss. The next step was to actually copy the data into the Oracle tables. This process went very smoothly and quickly. The only drawback was that, since we were only interested in the table structure and the data, we didn't want all the "free" stuff like triggers, sequences, tablespaces, etc. So we removed all of that before moving on. We would like to be able to use parameters, or something of the kind, to define what you want to migrate. Another thing we missed were DDL files: the Workbench immediately executes the chosen action, so a table-by-table approach was not possible. The result of this process was a set of filled Oracle tables, the starting point for step two.
For the second conversion we chose Oracle Warehouse Builder (Version 10.0.1.1.32 Beta, the 10gR2 beta of the OWB Paris release). We began by importing the source database structure from the data dictionary into a newly created project. After importing, the result can be viewed and checked in the Data Object Editor. The tables are shown on screen with their related tables and all relevant properties. The target database structure was created in the database and also imported into Warehouse Builder. All the requirements to create the first mapping were now available. A mapping is a graphical representation of all the steps necessary or desired to copy data from a (source) table to a (target) table. These steps are called operators. Mappings can vary from a simple 1-to-1 copy to very complicated ones which involve a lot of steps. In our conversion it was very simple and straightforward. A few tables could be copied 1 to 1. A few others had to be split into master-detail tables.
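To give an idea of what such a master-detail split does, here is a minimal Python sketch under invented assumptions: a flat source table of order lines (column names `order_no`, `customer`, `item`, `quantity` are made up for the example) is divided into one master row per order and one detail row per source line.

```python
def split_master_detail(rows):
    """Split flat denormalized rows into master and detail record sets."""
    masters = {}   # order_no -> master record (one per distinct order)
    details = []   # one record per source row
    for row in rows:
        # Each distinct order number yields exactly one master row.
        if row["order_no"] not in masters:
            masters[row["order_no"]] = {
                "order_no": row["order_no"],
                "customer": row["customer"],
            }
        # Every source row yields a detail row referencing its master.
        details.append({
            "order_no": row["order_no"],
            "item": row["item"],
            "quantity": row["quantity"],
        })
    return list(masters.values()), details
```

In Warehouse Builder the equivalent mapping would simply load the two target tables from the same source operator, with the master side deduplicated.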
To create a mapping you invoke the mapping editor. This is a graphical tool where you can use drag and drop to create the dataflow. Right from the first, most simple mapping, the behavior of this editor was very, very annoying. A typical dataflow is represented from left to right on the screen. Initially this was the case, but when a property of an operator was adjusted or a new one was added, the complete flow was reshuffled. A left-to-right flow was not detectable anymore. With simple mappings (4 to 5 operators) these reshuffled pictures are still understandable. Mappings with a lot more operators become total abracadabra, and you quickly lose your goodwill while creating them.
Another very annoying bug was that an operator could disappear from the screen. Because all properties are only accessible when you select an operator, these disappeared ones became useless. The only option left was to delete and recreate them. Probably and hopefully these are beta-version bugs.
Besides these quirks it was relatively simple to create intelligent PL/SQL scripts to actually perform the data transfer. For example, in our conversion we had to translate a meaningful (source) code to a newly created technical (target) id. The Key-lookup operator did this job very nicely. Simple things, like where to put a SUBSTR function on a source string, initially take a lot of time because you have to find the right place to store it. This is typical for a metadata tool like Warehouse Builder: it takes time to find your way around. The generated PL/SQL used nice new features like FORALL, MERGE, and CASE.
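The key-lookup step above can be sketched in a few lines of Python. This is not what OWB generates (it produces PL/SQL), just an illustration of the idea under invented names: assign a new technical id to every distinct meaningful code, then replace the code column (assumed here to be called `status_code`) in each row with the looked-up id.

```python
def build_key_lookup(codes):
    """Assign a new technical (surrogate) id to every distinct source code."""
    lookup = {}
    for code in codes:
        if code not in lookup:
            lookup[code] = len(lookup) + 1
    return lookup

def translate(rows, lookup, code_column="status_code"):
    """Replace the meaningful code in each row by its technical id."""
    out = []
    for row in rows:
        new_row = dict(row)
        new_row["status_id"] = lookup[new_row.pop(code_column)]
        out.append(new_row)
    return out
```

In the generated PL/SQL this translation would typically show up as a join against the lookup table, or as a MERGE keyed on the source code.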
We only used Warehouse Builder for a simple task and did not fully explore all its possibilities and tools. To conclude: after a rather skeptical initial mindset, we executed a successful two-step conversion which resulted in a good test data set. We are aware of the potential of Warehouse Builder when you're dealing with data operations between sources.