In recent months we have been busy building and extending a webforms application (10g Release 2) to integrate with Oracle InterConnect. The webforms application is a home-built application for the registration and calculation of costs and benefits related to the exploitation of building projects (in Dutch: ‘grondexploitatie’) for the cities of The Hague and Rotterdam (both in The Netherlands). The application is called Pagoni, dates back to 1993 and has been enhanced ever since. A recent change is that the application now integrates financial approvals with other applications (the finance module within Oracle e-Business Suite) using InterConnect.
Integrating with InterConnect:
InterConnect is used because it lets you integrate data between applications – Oracle and non-Oracle – in a fairly standard way. Because Pagoni is a standard application (deployed by an ever-growing number of city governments), our goal was to standardize the integration, and InterConnect is a good choice for that (within the Oracle technology stack). InterConnect provides a central hub through which different applications (spokes) are connected via adapters. Because there are different kinds of adapters, you can choose how your applications integrate with each other: via file, FTP, database tables, email, Oracle Advanced Queueing (AQ), and recently also JCA (Java Connector Architecture adapters, which let you connect to web services) and BPEL.
Having chosen InterConnect as the integrator, the next big thing to decide is which technology to use to send messages to the hub and back. We are integrating two Oracle applications (e-Business Suite and Pagoni) via the central hub, so an AQ adapter or a database adapter are the first logical choices. In my opinion the database adapter has the disadvantage that you must create a staging table in your application for each message type, plus accompanying PL/SQL code which, from the spoke application's perspective, is maintained via an external tool: iStudio.
Advanced Queueing is more flexible in this area, as messages are sent as raw XML. Only one queue table is needed, because the message type (RAW) is defined per queue table; the actual content of the XML message places no restriction on the definition of the multi-consumer queue table. Inside that one queue table we've created a queue for inbound messages and a queue for outbound messages, and we can create more queues in the existing queue table later if needed. To differentiate between the messages, each message direction has its own consumer (a consumer for inbound messages, a consumer for outbound messages, etc.). Also, we don't have to maintain PL/SQL code to implement the adapter via iStudio: besides the usual definition of the published and subscribed messages, the application views and the content-based routing, only the definition of the queue and the consumer are needed. All PL/SQL code to enqueue and dequeue messages is maintained in the spoke application itself.
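A minimal sketch of that setup is shown below; all names (queue table, queues, consumer) are invented for the example and are not the actual Pagoni objects.

BEGIN
  -- One multi-consumer queue table with a RAW payload; the XML content
  -- itself is not constrained by the table definition.
  DBMS_AQADM.CREATE_QUEUE_TABLE(
    queue_table        => 'pagoni_msg_qt',
    queue_payload_type => 'RAW',
    multiple_consumers => TRUE);

  -- One queue per message direction, both in the same queue table.
  DBMS_AQADM.CREATE_QUEUE(queue_name => 'pagoni_out_q', queue_table => 'pagoni_msg_qt');
  DBMS_AQADM.CREATE_QUEUE(queue_name => 'pagoni_in_q',  queue_table => 'pagoni_msg_qt');
  DBMS_AQADM.START_QUEUE(queue_name => 'pagoni_out_q');
  DBMS_AQADM.START_QUEUE(queue_name => 'pagoni_in_q');

  -- The InterConnect adapter reads outbound messages through its own consumer.
  DBMS_AQADM.ADD_SUBSCRIBER(
    queue_name => 'pagoni_out_q',
    subscriber => SYS.AQ$_AGENT('IC_CONSUMER', NULL, NULL));
END;
/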
The XML messages are constructed via a query, using the XML functions XMLELEMENT and XMLCDATA. The latter function is used because the content can contain "forbidden" XML characters. The query result is converted to RAW using the UTL_RAW package and then enqueued on the outbound queue. An advantage of raw XML is that it removes encoding from the payload, so even if the character set of the other application differs from ours, the messages are still readable.
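In outline, the enqueue side looks like the sketch below; the table, column and XML element names are made up for the example (the real Pagoni messages are of course richer).

DECLARE
  l_xml       VARCHAR2(32767);
  l_enq_opts  DBMS_AQ.ENQUEUE_OPTIONS_T;
  l_msg_props DBMS_AQ.MESSAGE_PROPERTIES_T;
  l_msgid     RAW(16);
BEGIN
  -- Build the message with XMLELEMENT; XMLCDATA shields content that may
  -- contain "forbidden" XML characters such as & and <.
  SELECT XMLELEMENT("approval",
           XMLELEMENT("projectId", p.project_id),
           XMLELEMENT("description", XMLCDATA(p.description))
         ).getStringVal()
    INTO l_xml
    FROM projects p
   WHERE p.project_id = 4711;

  -- Convert to RAW and enqueue on the outbound queue; every subscribed
  -- consumer (InterConnect, and later BPEL) gets its own copy.
  DBMS_AQ.ENQUEUE(
    queue_name         => 'pagoni_out_q',
    enqueue_options    => l_enq_opts,
    message_properties => l_msg_props,
    payload            => UTL_RAW.CAST_TO_RAW(l_xml),
    msgid              => l_msgid);
  COMMIT;
END;
/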
Incoming messages are dequeued by registering a PL/SQL procedure against the inbound queue. We could have opted to keep a hand-written dbms_job to invoke the dequeue, but via registration this is done automatically by the system, so we write and maintain less code. The PL/SQL procedure dequeues and processes the inbound message. The inbound XML is processed by converting the message to XMLType, using UTL_RAW, and then extracting nodes from the message.
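A sketch of that registration and callback, again with invented procedure, queue and consumer names, could look like this.

-- Callback with the parameter list AQ expects for PL/SQL notifications.
CREATE OR REPLACE PROCEDURE pagoni_in_callback(
  context  RAW,
  reginfo  SYS.AQ$_REG_INFO,
  descr    SYS.AQ$_DESCRIPTOR,
  payload  RAW,
  payloadl NUMBER)
IS
  l_deq_opts  DBMS_AQ.DEQUEUE_OPTIONS_T;
  l_msg_props DBMS_AQ.MESSAGE_PROPERTIES_T;
  l_payload   RAW(32767);
  l_msgid     RAW(16);
  l_xml       XMLTYPE;
BEGIN
  -- Dequeue exactly the message we were notified about.
  l_deq_opts.msgid         := descr.msg_id;
  l_deq_opts.consumer_name := descr.consumer_name;
  DBMS_AQ.DEQUEUE(
    queue_name         => descr.queue_name,
    dequeue_options    => l_deq_opts,
    message_properties => l_msg_props,
    payload            => l_payload,
    msgid              => l_msgid);

  -- RAW -> XMLType, then extract the nodes we need,
  -- e.g. l_xml.extract('/approval/projectId/text()').
  l_xml := XMLTYPE(UTL_RAW.CAST_TO_VARCHAR2(l_payload));
  COMMIT;
END;
/

BEGIN
  -- Register the procedure so the system invokes it for every new message
  -- for this consumer; no dbms_job needed.
  DBMS_AQ.REGISTER(
    SYS.AQ$_REG_INFO_LIST(
      SYS.AQ$_REG_INFO('PAGONI.PAGONI_IN_Q:PAGONI_CONSUMER',
                       DBMS_AQ.NAMESPACE_AQ,
                       'plsql://PAGONI.PAGONI_IN_CALLBACK',
                       HEXTORAW('FF'))),
    1);
END;
/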
Integrating with BPEL:
In the near future enhancements will be made to Pagoni to integrate with a BPEL engine. In a recently started pilot I'm researching how to implement this and how to prevent errors. Oracle's BPEL Process Manager offers several ways to integrate with other applications: web services, JCA, and adapters for database, advanced queues, FTP, email, file, etc. My first approach is the common denominator, Oracle AQ. This is already set up for InterConnect, so let's use it again for BPEL and see if it works… and it does.
Changes:
- Add a new consumer to the existing queue for outbound messages and use this consumer in the BPEL AQ adapter (see the sketch after this list). This must be a different consumer than the one used for the InterConnect adapter; if you don't, the IC adapter will consume messages before BPEL has even had the chance to sit at the table, let alone start consuming.
- Enter the database connection properties in the file $BPEL_HOME\integration\orabpel\system\appserver\oc4j\j2ee\home\application-deployments\default\AqAdapter\oc4j-ra.xml
- Be sure to use the exact same JNDI location (location="eis/AQ/") in JDeveloper BPEL Designer (or Eclipse) as is defined in the previous file, so that the JNDI name on the BPEL server matches the one used while developing.
- Bounce the BPEL OC4J application.
- Copy the location name into the JCA adapter file.
- Build the BPEL process.
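For that first step, adding the extra consumer is a one-liner; the consumer name below is invented and must match whatever consumer name is configured in the BPEL AQ adapter.

BEGIN
  -- A separate consumer for BPEL, next to the InterConnect consumer, so
  -- both subscribers receive their own copy of each outbound message.
  DBMS_AQADM.ADD_SUBSCRIBER(
    queue_name => 'pagoni_out_q',
    subscriber => SYS.AQ$_AGENT('BPEL_CONSUMER', NULL, NULL));
END;
/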
If the first, i.e. instantiating, step of the BPEL process is dequeuing the inbound messages, the process will automatically wake up whenever a message for the BPEL consumer is enqueued on the queue.
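To check that the BPEL consumer really gets (and keeps) its messages, and that the InterConnect consumer is not taking them away, the multi-consumer AQ$ view of the queue table can be queried per consumer; the object names follow the invented ones used above.

-- One row per message per subscriber; MSG_STATE shows whether a consumer
-- still has the message waiting or has already processed it.
SELECT queue, consumer_name, msg_state, enq_time
  FROM aq$pagoni_msg_qt
 WHERE consumer_name IN ('IC_CONSUMER', 'BPEL_CONSUMER')
 ORDER BY enq_time;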
One can argue whether I should connect Pagoni directly to BPEL, omitting InterConnect. InterConnect is my central hub and BPEL is just another application which is interested in messages published by Pagoni, so it would make sense to connect BPEL to InterConnect rather than to Pagoni. In that case no changes have to be made to Pagoni; on the IC hub I need another adapter to connect to BPEL.
…