Thursday, December 22, 2016

SOA 12c: Process Large Files Using Oracle MFT & File Adapter Chunked Read Option

SOA 12c adds a new ChunkedRead operation to the JCA File Adapter. Previously, users had to configure a SynchRead operation and then hand-edit the JCA file to achieve a "chunked read". In this blog, I will explain how to process a large file in chunks using the SOA File Adapter, along with some best practices. The major advantage of chunking large files is that it reduces the amount of data loaded into memory at any one time and makes efficient use of the translator resources.

"File processing" here means reading, parsing and translating the file contents. If you just want to move or transfer a file, consider using MFT for the best performance, efficiency and scalability.

In this example, MFT picks up a large customer records file from a remote SFTP location and sends it to the SOA layer for further processing. The MFT configuration is straightforward and out of scope for this entry. For more info on Oracle MFT, read here.

SOA 12c offers tight integration with Oracle MFT through the simple-to-use MFT adapter. When the MFT adapter is configured as a service, MFT can pass the file to the SOA process either inline or as a reference. When configured as a reference, it enables a SOA process to leverage MFT to transfer a file.

MFT also provides useful file metadata (target file name, directory, file size etc.) as part of the MFT header in the SOAP request.
Create a File Adapter:

Drag and drop a File adapter onto the external references swimlane of the SOA composite. Follow the wizard to complete the configuration as shown below. Ensure that you choose the "Chunked Read" operation and define a chunk size - the number of records read from the file in each iteration. For example, for a file with 500 records and a chunk size of 100, the adapter would read the file in 5 chunks.

You will also have to create an NXSD schema, which can be generated from a sample flat file. The file adapter uses the NXSD to read the flat file and convert it into XML format.
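As a rough sketch, an NXSD for a comma-delimited customer file might look like the following. This is illustrative only - the element names, namespace and delimiters are hypothetical and would come from your own sample file:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
            xmlns:nxsd="http://xmlns.oracle.com/pcbpel/nxsd"
            targetNamespace="http://example.com/customers"
            elementFormDefault="qualified"
            nxsd:version="NXSD" nxsd:stream="chars" nxsd:encoding="UTF-8">
  <xsd:element name="Customers">
    <xsd:complexType>
      <xsd:sequence>
        <!-- one Customer element per line of the flat file -->
        <xsd:element name="Customer" minOccurs="1" maxOccurs="unbounded">
          <xsd:complexType>
            <xsd:sequence>
              <xsd:element name="Id"   type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy=","/>
              <xsd:element name="Name" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy=","/>
              <xsd:element name="City" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="${eol}"/>
            </xsd:sequence>
          </xsd:complexType>
        </xsd:element>
      </xsd:sequence>
    </xsd:complexType>
  </xsd:element>
</xsd:schema>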

Implementing the BPEL Process:

Now, create a BPEL process using the BPEL 2.0 specification [This is the default option].
As a best practice, make the BPEL process asynchronous - this ensures that the "long running" BPEL process doesn't hog threads.

In this case, since we are receiving a file from MFT, we will choose "No Service" template to create a BPEL process with no interface. We will define this interface later with the MFT adapter.

Create MFT Adapter:

Drag and drop an MFT adapter onto the "Exposed Services" swimlane of your SOA composite application, provide a name and choose "Service". Now, wire the MFT Adapter service and the File Adapter reference to the BPEL process we created. Your SOA composite should look like the one below:

Processing large file in chunks:

In order to process the file in chunks, the BPEL process invoke that triggers the File Adapter must be placed within a while loop. During each iteration, the file adapter uses the property header values to determine where to start reading.

At a minimum, the following JCA adapter properties must be set:

jca.file.FileName : Send/Receive file name. This property overrides the adapter configuration. Very handy property to set / get dynamic file names
jca.file.Directory : Send/Receive directory location. This property overrides the adapter configuration
jca.file.LineNumber : Set/Get line number from which the file adapter must start processing the native file
jca.file.ColumnNumber : Set/Get column number from which the file adapter must start processing the native file
jca.file.IsEOF : File adapter returns this property to indicate whether end-of-file has been reached or not

Apart from the above, there are 3 other properties that help with error management and exception handling:

jca.file.IsMessageRejected : Returned by the file adapter if a message is rejected (non-conformance to the schema/not well formed)
jca.file.RejectionReason : Returned by the file adapter in conjunction with the above property. Reason for the message rejection
jca.file.NoDataFound : Returned by the file adapter if no data is found to be read

In the BPEL process "Invoke" activity, only the jca.file.FileName and jca.file.Directory properties are available to choose from the Properties tab. We will have to configure the other properties manually.

First, let's create a bunch of BPEL variables to hold these properties. For simplicity, just create all variables with a simple XSD string type.

Let's now configure the file adapter properties.

For input, we must first send the file name, directory, line number and column number to the file adapter so that the first chunked read can happen. From the return (output) properties, we will receive the new line number, column number and end-of-file flag, which are fed back to the adapter within a while loop.

Click on the "Source" tab of the BPEL process and configure the following properties. The syntax shown below is for the BPEL 2.0 spec, since we built the BPEL process on BPEL 2.0.

Note: In BPEL 1.1 specification, the syntax was bpelx:inputProperties & bpelx:outputProperties.
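As a sketch, the invoke of the File Adapter with its property bindings might look like the following in BPEL 2.0 source view. The partnerLink, operation and variable names here are hypothetical - substitute the ones from your own composite:

```xml
<invoke name="InvokeChunkedRead" partnerLink="FileAdapterService"
        operation="ChunkedRead"
        inputVariable="readInput" outputVariable="readOutput">
  <bpelx:toProperties>
    <!-- input properties: tell the adapter which file to read and where to resume -->
    <bpelx:toProperty name="jca.file.FileName"     variable="fileName"/>
    <bpelx:toProperty name="jca.file.Directory"    variable="directory"/>
    <bpelx:toProperty name="jca.file.LineNumber"   variable="lineNumber"/>
    <bpelx:toProperty name="jca.file.ColumnNumber" variable="columnNumber"/>
  </bpelx:toProperties>
  <bpelx:fromProperties>
    <!-- output properties: where the adapter stopped, and whether EOF was reached -->
    <bpelx:fromProperty name="jca.file.LineNumber"   variable="returnLineNumber"/>
    <bpelx:fromProperty name="jca.file.ColumnNumber" variable="returnColumnNumber"/>
    <bpelx:fromProperty name="jca.file.IsEOF"        variable="returnIsEOF"/>
  </bpelx:fromProperties>
</invoke>
```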
Drag and drop an assign activity before the while loop to initialize the variables for the first read (first chunk) - we know the first chunk of data starts at line 1, column 1.

lineNumber -> 1
columnNumber -> 1
isEOF -> 'false'

For the while loop condition, the file adapter must be invoked until end-of-file is reached; enter the following loop condition:
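In BPEL 2.0 source view, the loop condition might look like this (assuming the isEOF variable is a plain string, as created above):

```xml
<while name="WhileNotEOF">
  <!-- keep reading chunks until the adapter reports end-of-file -->
  <condition>$isEOF = 'false'</condition>
  <!-- invoke and assign activities go inside the loop -->
</while>
```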

Within the while loop, drag & drop another assign activity to re-assign file adapter properties.

returnIsEOF -> isEOF
returnLineNumber -> lineNumber
returnColumnNumber -> columnNumber

This ensures that in the next iteration, the file adapter starts fetching records from where it previously stopped. For example, if you have a file with 500 records and a chunk size of 100, returnLineNumber will have a value of 101 after the first iteration completes, so the file adapter resumes reading from line number 101 instead of starting over.
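The re-assignment step above can be sketched as a simple BPEL 2.0 assign (variable names as created earlier):

```xml
<assign name="FeedbackAdapterProperties">
  <!-- feed the adapter's return position back in for the next chunk -->
  <copy><from>$returnLineNumber</from><to>$lineNumber</to></copy>
  <copy><from>$returnColumnNumber</from><to>$columnNumber</to></copy>
  <copy><from>$returnIsEOF</from><to>$isEOF</to></copy>
</assign>
```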

Your BPEL process must look like this;

We now have a BPEL process that receives the file reference from MFT and reads the large file in chunks.

Further processing, such as data shaping and transformation, can be done from within the while loop.

Thursday, November 17, 2016

SOA 12c RCU: Oracle XE 11g TNS listener does not currently know of SID

Recently, I installed the Oracle XE 11g database on my Windows machine to host my SOA 12c RCU.
Note: Although XE is not a certified database for SOA 12c, it works just fine for development purposes.

Strangely enough, the RCU utility was unable to connect to the database instance. I kept getting the error "Unable to connect to the DB. Service not available".
I was pretty sure that all my connect parameters were correct.

Also worth noting: I couldn't connect to the database's APEX application either.

The first suspicion was the service name, as sometimes during installation the domain name gets appended to it - e.g., instead of orcl, it might be registered as orcl.localdomain.

A quick look at the listener.ora file revealed that the default service name was indeed XE.

However, when I ran the lsnrctl status command, I could see that the XE service was not listed.

Default Service           XE
Listener Parameter File   C:\oraclexe\app\oracle\product\11.2.0\server\network\admin\listener.ora
Listener Log File         C:\oraclexe\app\oracle\diag\tnslsnr\SATANNAM-US\listener\alert\log.xml
Listening Endpoints Summary...
Services Summary...
Service "CLRExtProc" has 1 instance(s).
  Instance "CLRExtProc", status UNKNOWN, has 1 handler(s) for this service...
Service "PLSExtProc" has 1 instance(s).
  Instance "PLSExtProc", status UNKNOWN, has 1 handler(s) for this service...
The command completed successfully.

This happens when the listener hasn't registered the XE service properly. In my case, restarting the database and listener services didn't help. Remember, as a best practice the listener must always be started before the database so that the database can register its services with it.

The fix is to manually instruct the database to register the XE service. To do this, log in to sqlplus as sysdba and issue the following commands.

> sqlplus / as sysdba

Connected to:
Oracle Database 11g Express Edition Release - 64bit Production

SQL> alter system set LOCAL_LISTENER='(ADDRESS=(PROTOCOL=TCP)(HOST=localhost)(PORT=1521))' scope=both;
SQL> alter system register;
Exit sqlplus and restart your OracleServiceXE and listener services.

Now, the lsnrctl status command gives the following output:


Default Service           XE
Listener Parameter File   C:\oraclexe\app\oracle\product\11.2.0\server\network\admin\listener.ora
Listener Log File         C:\oraclexe\app\oracle\diag\tnslsnr\SATANNAM-US\listener\alert\log.xml
Listening Endpoints Summary...
Services Summary...
Service "CLRExtProc" has 1 instance(s).
  Instance "CLRExtProc", status UNKNOWN, has 1 handler(s) for this service...
Service "PLSExtProc" has 1 instance(s).
  Instance "PLSExtProc", status UNKNOWN, has 1 handler(s) for this service...
Service "XEXDB" has 1 instance(s).
  Instance "xe", status READY, has 1 handler(s) for this service...
Service "xe" has 1 instance(s).
  Instance "xe", status READY, has 1 handler(s) for this service...
The command completed successfully.

You can see that the XE service is now registered and ready. Also note that HTTP port 8080 is up and running - meaning you can now successfully access the APEX URL.

Monday, November 7, 2016

Process Cloud Service (PCS) Integration Options

Process is ubiquitous - be it SaaS process extensions, automating a manual process, gaining visibility into a process or just eliminating human errors.

With a fully visual, browser-based, no-IDE platform that runs on the cloud, Process Cloud Service lends itself as a simple yet powerful tool for citizen developers and LOB users alike to rapidly automate their business processes with little to no dependency on IT/DevOps. Cloud platform (PaaS) offerings such as Process Cloud Service and Integration Cloud Service enable modern enterprises leveraging a range of SaaS applications to extend, automate and integrate back with on-prem systems.

Outside of its own instance data, business processes also need data from external data sources. Process Cloud Service offers 3 options to seamlessly integrate with external systems / services:

1) SOAP Web Services
2) REST Services
3) ICS (Integration Cloud Service)

To invoke or call external services using a Service activity within a business process, we must first create a connector - available under the Integrations section in Process Composer.

Out of the box, Process Cloud Service allows connectivity to external services over the SOAP / REST protocols. For any other type of integration - e.g., database, file, Oracle/3rd-party apps - you have 2 options:

1) Expose them as SOAP/REST APIs either through a middle tier or using natively available options (eg., APEX ORDS for Database) and call them directly from PCS
2) Use Integration Cloud Service (ICS) to quickly interface your target data source as SOAP/REST using a range of technology, application and SaaS adapters


1) SOAP Integration:

With this integration option, you can connect to any SOAP web service that is accessible over the internet. You can either upload a WSDL definition or use a SOAP URL directly.
If you use a URL, notice that all the referenced schema (XSD) files are imported automatically.

You also have an option to configure the "Read Timeout", "Connection Timeout" and WS-Security parameters for the service.


2) REST Integration:

Process Cloud Service offers extensive support for integrating with and connecting to REST APIs. An intuitive wizard guides you through the configuration of REST-based services, including the various HTTP verbs, resources and request-response payloads.

3) ICS Integration

Process Cloud Service (PCS) provides tight-integration to Integration Cloud Service (ICS) among other PaaS / IaaS services such as Documents Cloud Service, Business Intelligence Cloud Service, Storage Cloud and Notification Service.

All it requires is a one-time configuration in the PCS workspace; while modeling a process, the service connector then displays all ICS integrations to choose from.

With all these different integration options, Process Cloud Service not only delivers rapid process automation but also offers extensive connectivity to external systems and services.

Monday, September 19, 2016

Learner Series: Dynamically load XSL templates

I received a few requests from readers on how to dynamically load XSLT stylesheets in SOA 11g/12c.

This use case becomes important in scenarios such as the following:

  • You have one input source (master data) and multiple destinations, each of which receives the same data in a different format
  • Your destination requires minor changes to the data being sent, and you don't want to redeploy the whole process, causing downtime to other systems
  • You have dynamic partner links to connect to multiple targets, each of which expects data in a certain format

We will leverage SOA-MDS to store and retrieve our XSL stylesheets dynamically at runtime.

Step 1: Develop your XSLT mappings and test them before you persist them on MDS
Ref: learn how to use MDS in 11g/12c

Once you have your stylesheets in MDS, you can reference them from within your BPEL process using the oramds:/ protocol

Step 2: You don't need to use the "Transform" activity in your BPEL process. Use either of the following BPEL XPath extension functions to load and process your XSL stylesheets within an "Assign" activity.


For example, ora:processXSLT('oramds:/apps/stylesheets/xformOrder.xsl', $inputVariable.payload)

You can further parameterize this expression by using a DVM to store the XSL references and using dvm:lookupValue to obtain the first parameter - the XSL template location.
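For instance, a copy rule along these lines would resolve the stylesheet location at runtime. The DVM path, column names and variables here are hypothetical, shown only to illustrate the pattern:

```xml
<assign name="DynamicTransform">
  <copy>
    <!-- look up the stylesheet location for this destination, then apply it -->
    <from>ora:processXSLT(
            dvm:lookupValue('oramds:/apps/dvm/StylesheetMap.dvm',
                            'Destination', $destination,
                            'XslLocation', 'oramds:/apps/stylesheets/default.xsl'),
            $inputVariable.payload)</from>
    <to>$outputVariable.payload</to>
  </copy>
</assign>
```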

Monday, September 12, 2016

Smart Connect Modern Enterprise Apps & SaaS w/ Oracle iPaaS

Modern enterprise systems stretch from ground to the cloud and everything in between.
With plethora of diverse SaaS applications combined with existing home grown/legacy on-premise applications - each promising to solve a specific business problem, the integration problem has just got complicated.

IDC predicts that by 2018, SaaS-based enterprise applications would generate over $50 billion in revenue resulting in more than 27% of enterprise apps running on cloud.

Enterprises must equip themselves to embrace this change to integrate, secure and manage their "extended" enterprise on cloud.

Most traditional integration solutions have 2 shortcomings;
One, they require considerable DevOps efforts - think development, deployment, maintenance etc..
Two, they aren't built ground-up for SaaS integrations - lack of SaaS adapters, network latencies, firewall pinholes for SaaS connectivity etc..

Hence the need for an iPaaS - Integration Platform as a Service. For any iPaaS solution to be successful, there are 3 important considerations that enterprise architects must account for;

  1. Ease of Use
  2. Time to Market &
  3. Deployment Choice

1. Ease of Use:

A cloud solution's first deliverable to business must be "Simplifying IT" and bringing IT closer to business. Oracle Integration Cloud Platform (ICS) is built for "citizen developers" and hence truly offers a "zero code" integration platform. This is a huge advantage for business and IT alike, as all technology complexities are hidden away. This means; there is no new technology to learn / ramp-up, no skill-gaps to fill, quicker turn-around times...

Oracle ICS also features a pattern-driven integration model with a bunch of common integration patterns to choose from, for a variety of integration needs including pub-sub, straight-on data mapping, orchestration etc.. all delivered just over a browser. No IDE, No installation & Zero Code.

2. Time to Market:

A huge impediment to any project plan is "TTM" delays that concerns the business.

Even for some of my customers who are on the bleeding edge of technology find it pragmatically difficult to staff, develop, administrate & manage their integration projects - partly owing to changing trends in technology but mostly because their DevOps can't scale to handle business demands. For instance, in the last 6-9 months, their sales department has bought into, HR moved to Fusion HCM cloud and Marketing is automating campaings on Eloqua. All of these are strategic initiatives driven by the line of business which offers feature-rich enterprise applications with lesser dependency on IT Ops at least to manage & maintain them.

However, care must be taken not to build silo'ed SaaS applications - there must be a robust integration platform to connect SaaS with On-prem systems without the complexities of a traditional middleware. Oracle ICS was architected ground-up with "Time to Market" as its principle goal. Integrations that typically take a few months can be up & running in a few days.

This is made possible with the ever-growing list of feature-rich SaaS, Apps and Technology adapters built for cloud, pre-built integrations and smart recommendations.

3. Deployment Choice:

Another integration decision is the "Integration Center of Gravity" which defines where the integration can be run for best performance. Let's say we want to connect 2 SaaS applications - does it make sense to run the integration on-prem behind your enterprise firewall? probably not. On the contrary, if you want to integrate 2 on-prem systems but still like to leverage the advantages that Oracle ICS offers, you have the flexibility to run ICS on-prem within your datacenter.

Oracle ICS is a truly hybrid iPaaS providing full deployment choice whether you want to run your integration platform on cloud or on ground.

Friday, September 9, 2016

Learner Series: SOA 12c Share Resources easily using MDS

SOA 12c has simplified the way developers leverage the MDS capabilities.

Developers still have the 11g way of deploying "SOA Bundle" archives to MDS described here.
But if you are on SOA 12c, you get a simpler option.

Once you install the SOA 12c Quick Start, you get a file-based SOA design-time MDS repository by default. You can choose where to host your MDS root.

Under the Resources window in JDeveloper, expand the SOA-MDS IDE connection. You should see a default SOA_DesignTimeRepository. Go to its properties and set your MDS root folder.

You can copy the resources you want to reuse/share into your MDS root folder and reference them within your SOA composite using the oramds:/ protocol, e.g., oramds:/apps/xsd/employee.xsd

Now to create a runtime MDS repository, first create a DB MDS connection to your SOA instance using your prefix_MDS schema [prefix is what you specified during RCU config].

From the Resources window -> IDE connection panel, right click SOA-MDS and create a new SOA-MDS connection. Provide a unique connection name, choose Connection type as "DB Based MDS", select the MDS connection created above and choose the "soa-infra" partition.

Note: SOA Quick Start by default uses the Java Derby database (no DB MDS capabilities). You must have either a compact domain installation or a full-fledged SOA installation (your staging / test / production server).

To deploy the design-time (file-based) MDS resources to the DB-based runtime MDS repository, right-click on the design-time repository and choose Transfer. The wizard will prompt you for the runtime MDS to which the resources must be transferred. Choose the resources you need to transfer, select your DB MDS connection and click Transfer.

Friday, July 29, 2016

Learner Series: Handling Dynamic Arrays in SOA 12c (BPEL 2.0)

Dynamic arrays are always tricky to handle within a BPEL process. Every now and then we encounter XML schemas that have generic name-value pair arrays that are unbounded. The challenge is assigning data to and from these dynamic arrays, as there are no concrete target XML elements at runtime. Hence, the XML elements must be generated first before values can be assigned to them.

There are multiple ways this problem can be solved.

For one, you can choose XSLT over a simple assign. XSLT wields more granular control over how the XML elements are handled; you can instruct the XSL processor to loop over the dynamic array list and assign values. This of course requires some level of XSLT skill (although JDeveloper presents an easy XSL mapper, I prefer fiddling with the source), and in certain cases you may have to pass some BPEL variables (properties) to the XSLT in addition to the source and target variables.

The second option is to use an XML literal (called an XML fragment in BPEL 1.1) within your assign activity. Here you can pre-form the XML literal, assign it to the target and then map the values. Not very sophisticated, and if you are not careful with namespaces, this can cause a lot of mapping trouble.

Thirdly, with BPEL 2.0 you can simply add an attribute to the copy action in your assign activity to achieve the same result. The simplest of all.

Since this is a learner series, let's get into some details on how to go about this;

Let's first understand the root-cause of the error;

The problem with assigning dynamic arrays within BPEL is that the copy succeeds for the first element in the array, since that XML element is always "available".
However, starting with the second element, every copy rule within the assign fails with a "selection failure" and one of the following errors, because the dynamic array XML elements are not yet formed or are empty:

"Exception is thrown because the to-spec at line 140 is evaluated to be empty"
"Exception is thrown because the from-spec at line 160 is evaluated to be empty"

depending on whether you are trying to copy to a target or copy from a source.

Your XML schema containing dynamic array may look something like this;

Your typical assign would look like the following - you have to manually tell BPEL which element the values must be mapped to - [1], [2], [3] ... [n]. Just append this index to the root XML element that contains the dynamic array. In this case:

Now, starting with the second copy rule within your assign, right-click on the rule item and select "ignoreMissingFromData" or "insertMissingToData" depending on whether your dynamic array is being read or written to respectively.

This action would add a flag (attribute) to the copy rule instructing BPEL to handle the dynamic XML element - either ignore or insert.

<copy bpelx:insertMissingToData="yes">
<copy bpelx:ignoreMissingFromData="yes">
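Putting it together, a pair of copy rules for a two-element array might look like the sketch below. The element names (item, entry) and variables are hypothetical stand-ins for your own name-value pair schema:

```xml
<assign name="AssignDynamicArray">
  <!-- first element already exists in the target, so a plain copy works -->
  <copy>
    <from>$source.payload/ns:item[1]/ns:value</from>
    <to>$target.payload/ns:entry[1]/ns:value</to>
  </copy>
  <!-- from the second element on, create the missing target node on the fly -->
  <copy bpelx:insertMissingToData="yes">
    <from>$source.payload/ns:item[2]/ns:value</from>
    <to>$target.payload/ns:entry[2]/ns:value</to>
  </copy>
</assign>
```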

Happy BPELing...