A few years ago I had a discussion with a colleague that resulted in an article in our consulting magazine. He asked me if I would use InterConnect for an interface that processes 1,000,000 records in batch every night, where the receiving application is not interested in getting the records in real-time.
Because then your only argument to use an EAI tool like InterConnect, Tibco or an ESB would be that the receiving application needs the data in real-time.
But the batch alternative results in a costly job that has to be maintained. The batch has to fit in the batch window. There is no reuse of the code (the transformations and data enrichments in the batch). And the providing application is put under pressure by the requirements of the remote application. When you're able to get the change events out of the source application in real-time, you have all the information at hand at the moment the event occurs. You can then do the transformations and enrichments at that very moment. And you could do them asynchronously if they would otherwise make the end-user session less responsive.
Doing so, you can subscribe multiple applications/services to the events. Also, after the event is published, the providing application is rid of any further responsibility for it.
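To illustrate the publish/subscribe idea: with Oracle AQ you could use a multi-consumer queue, so that every subscribing application gets its own copy of each event. A minimal sketch (all object names are example names, and it assumes you first created an object type customer_event_t for the payload):

  begin
    -- Create a queue table that allows multiple consumers, so each
    -- subscriber receives its own copy of every event message.
    dbms_aqadm.create_queue_table(queue_table        => 'customer_events_qt',
                                  queue_payload_type => 'customer_event_t',
                                  multiple_consumers => true);
    dbms_aqadm.create_queue(queue_name  => 'customer_events_q',
                            queue_table => 'customer_events_qt');
    dbms_aqadm.start_queue(queue_name => 'customer_events_q');
    -- Subscribe two example applications to the same event stream.
    dbms_aqadm.add_subscriber(queue_name => 'customer_events_q',
                              subscriber => sys.aq$_agent('CRM_APP', null, null));
    dbms_aqadm.add_subscriber(queue_name => 'customer_events_q',
                              subscriber => sys.aq$_agent('DWH_APP', null, null));
  end;
  /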
If your receiving application is not able to process the mutations in real-time, you can collect them in a staging table. Then you only need a very light batch that just applies each mutation from the staging table to the receiving application. Since all the transformations and enrichments are done in real-time, the mutations are already in a processable state for the batch.
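Such a light batch boils down to little more than a loop over the staging table. A sketch, assuming a hypothetical staging table mutation_stg and a hypothetical procedure process_mutation in the receiving application:

  begin
    for rec in (select rowid as rid, payload
                from   mutation_stg
                where  status = 'NEW'
                order  by mutation_id)
    loop
      -- The mutation was already transformed and enriched at event time,
      -- so applying it is a straightforward call.
      process_mutation(rec.payload);
      update mutation_stg
      set    status = 'PROCESSED'
      where  rowid  = rec.rid;
    end loop;
    commit;
  end;
  /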
When you are to SOA-enable an enterprise that uses packaged apps like Oracle E-Business Suite or conventional (Designer/Developer) custom apps, then you may have to break up this conventional way of working with batches. Possibly you can copy and paste transformation and data-enrichment code from the batches. But chances are that you'll have to rebuild it with technologies such as XSLT.
When you have to build or maintain a custom app, or you work as a developer at a packaged-app provider, then keep this in mind as well. Try to work event-driven. Make sure that on every mutation you're able to put a message on a queue (for example AQ) or publish an event to an ESB product (for example Oracle SOA Suite).
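For a custom app that can be as simple as a table trigger that enqueues a message on the multi-consumer queue from the sketch above. Again example names only (the customers table and the customer_event_t payload type are hypothetical):

  create or replace trigger customers_aiu
  after insert or update on customers
  for each row
  declare
    l_enqueue_options    dbms_aq.enqueue_options_t;
    l_message_properties dbms_aq.message_properties_t;
    l_msgid              raw(16);
  begin
    -- The enqueue is transactional: the event only becomes visible to the
    -- subscribers when the end-user transaction commits, so the session
    -- isn't held up by any downstream processing.
    dbms_aq.enqueue(queue_name         => 'customer_events_q',
                    enqueue_options    => l_enqueue_options,
                    message_properties => l_message_properties,
                    payload            => customer_event_t(:new.id, :new.name),
                    msgid              => l_msgid);
  end;
  /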
Oracle Workflow has the Business Event System (BES), which is a really nice system for publishing and subscribing to events. It is AQ-based and available with every main product of Oracle (Database, Application Server). It mainly executes in the database, so for custom Forms applications, especially when they're based on Designer/Headstart/CDM RuleFrame, it is the ideal way to expose change events asynchronously.
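Publishing through BES is just as simple from PL/SQL: you raise an event by name, and the subscriptions in the event repository determine what happens with it. A sketch (the event name is a made-up example; it would have to be defined in the BES event repository first):

  declare
    l_customer_id number := 1234;  -- example key value
  begin
    -- wf_event.raise hands the event to the Business Event System;
    -- the event key uniquely identifies this occurrence of the event.
    wf_event.raise(p_event_name => 'nl.mycompany.customer.updated',
                   p_event_key  => to_char(l_customer_id));
    commit;
  end;
  /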
Very unfortunately, Oracle decided to de-support Workflow (and thus BES). It remains supported as long as the product it's shipped with is supported. So as long as Oracle Database 10g or Oracle AS 10g is supported, you can use Workflow/BES.
Actually it is quite remarkable that they de-support it. BES used to be written in PL/SQL, but in the very latest release (2.6.3 or 2.6.4, which shipped with 10gR2) they rebuilt it to run in a J2EE container! I wonder why they did that, because soon after that the Statement of Direction announced the de-support. I have two possible explanations:
- E-Business Suite still ships with embedded Workflow. Release 12 is OC4J-based (R11 is still based on the old Oracle 9i Application Server version 1.0.2.2, which is formally desupported for customers). In R12 the embedded Oracle Workflow will still be available, and so the BES can be J2EE-based. It also makes it simpler to have Java calls on events: in 2.6.2 it used some kind of detour solution to get from PL/SQL to Java.
- Did you ever take a glance at the data model of Oracle SOA Suite ESB 10.1.3? You will see some tables of ... indeed: Oracle Workflow/BES!