Wednesday 17 December 2014

Send HTML Formatted email using SoaSuite 11g/12c

This week I stumbled upon a question on the SOA Suite forum on the Oracle communities, where the questioner wanted to send HTML-formatted emails.

It happened that I was busy with the email adapter myself, and it would be nice to have neatly formatted email. It took me some time, but I managed to do it.
It basically consists of:
  1. Define an outbound email adapter config with an Opaque element. This expects a Base64-encoded payload, which enables you to put in everything you want (given it is valid for your receiver).
  2. Create an HTML payload as a string, simply by concatenating all the HTML code, your content and probably variable content.
  3. Use an embedded-java activity to unescape the XML encodings and Base64-encode the result.
  4. Copy the Base64-encoded payload to the opaque message payload, and fill in the subject and the to-email-address.
  5. Invoke the email adapter, with the following properties on the invoke activity:
    1. -> based on a variable
    2. jca.ums.subject -> based on a variable
    3. jca.ums.msg.content-type -> based on an expression: "text/html"
The last property is important: it tells the receiving application to interpret the message as HTML.
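The unescape-and-encode of step 3 can be sketched in plain Java; a minimal sketch (the class and method names are mine, and the entity list is deliberately minimal), assuming the embedded-java activity receives the XML-encoded HTML as a string:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class OpaqueMailPayload {

    // Turn an XML-entity-encoded HTML string into the Base64 payload the
    // opaque email-adapter element expects. '&amp;' is handled last so
    // already-unescaped entities are not double-unescaped.
    public static String toOpaque(String xmlEncodedHtml) {
        String html = xmlEncodedHtml
                .replace("&lt;", "<")
                .replace("&gt;", ">")
                .replace("&quot;", "\"")
                .replace("&amp;", "&");
        return Base64.getEncoder()
                     .encodeToString(html.getBytes(StandardCharsets.UTF_8));
    }
}
```

The resulting string is what gets copied into the opaque message payload before the invoke.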

Read the complete how-to here.

When I have time, I'll update this post at a later stage to transfer the content of that how-to to this page.

Monday 15 December 2014

ora:getAttachmentProperty: Failed to decode properties string

Last week I created a process that polls the usermessagingservice-email adapter to read emails and store the email with attachments as Oracle ACM case documents.

For simple e-mails this works fine, but with some emails I get the error:
An error occurs while processing the XPath expression; the expression is ora:getAttachmentProperty('Content-Type', 'ReceiveMessage_ReceiveNotification_InputVariable','body', '/ns2:message/ns2:attachment[$AttachmentId]').

The XPath expression failed to execute; the reason was: java.lang.RuntimeException: Failed to decode properties string ,att.contentId=1,Content-Type=multipart/related;

    boundary "---- _Part_188_790028878.1418047669530",.

Check the detailed root cause described in the exception message text and verify that the XPath query is correct.

XPath expression failed to execute

This occurs with both the functions ora:getAttachmentProperty and ora:getAttachmentContent.

It turns out that the attachments that fail are embedded attachments, like company logos in email signatures. They have a content-type like "multipart/related"; see a properties string like:
 boundary "---- _Part_21_137200243.1418648527909",/related;
But 'real' attachments did work, and had a properties string like:
Content-Transfer-Encoding=base64,Content-Disposition=attachment,att.contentId=2,Content-ID=<>,Content-Type=image/jpeg; name DataSources.jpg,

Apparently, if the content-type in the properties string does not start with 'multipart', then the attachment is processable.
When you open the BPEL ReceiveMessage_ReceiveNotification_InputVariable input variable, you'll see something like:
  <part xmlns:xsi="" name="body">
      <message xmlns="">
         <attachment href="409425ae-8448-11e4-a413-000c297d0a6d"/>
         <attachment href="409907af-8448-11e4-a413-000c297d0a6d"/>
         <attachment href="4099a3f0-8448-11e4-a413-000c297d0a6d"/>
         <attachment href="409a4031-8448-11e4-a413-000c297d0a6d"/>
         <opaqueElement xmlns="">PGh0bWwgeG1sbnM6dj0idXJuOnNjaGVtYXMtbWljcm9zb2Z0LWNvbTp2bWwiIHhtbG5zOm89InVy

In the Soa Infra database a table exists named 'ATTACHMENTS', with three columns: 'KEY', 'ATTACHMENT' and 'PROPERTIES'.

When you do a select on that table where the key equals the @href attribute, you'll find the attachment. The PROPERTIES column contains a properties string like the ones mentioned above.

So I created a DB Adapter config on this table, with JNDI name 'eis/DB/SOA' (referring to an already configured datasource in the DB Adapter). In the for-each loop I first query the attachment using the href attribute.

Then I extract the content-type using the expression
substring-before(substring-after($InvokeQuerySoaInfraAttachments_QuerySoaInfraAttachmentsSelect_OutputVariable.AttachmentCollection/ns6:Attachment/ns6:properties, 'Content-Type='),';')

In an IF statement using an expression like
not(starts-with($contentType, 'multipart'))
I only process those attachments that have a content-type that is not multipart.

Probably a more sophisticated expression can be found; I could check against a list of supported mime-types.
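The same filtering logic can be mimicked in plain Java; a small sketch (class and method names are mine) of the substring-before/substring-after extraction used above:

```java
public class AttachmentProps {

    // Extract the Content-Type value from a SOA-infra attachment properties
    // string, mirroring substring-before(substring-after(..., 'Content-Type='), ';').
    // Unlike XPath's substring-before, this keeps the remainder when no ';' follows.
    public static String contentType(String properties) {
        int start = properties.indexOf("Content-Type=");
        if (start < 0) {
            return "";
        }
        String rest = properties.substring(start + "Content-Type=".length());
        int semicolon = rest.indexOf(';');
        return semicolon < 0 ? rest : rest.substring(0, semicolon);
    }

    // An attachment is processable when its content-type does not start with 'multipart'.
    public static boolean isProcessable(String properties) {
        return !contentType(properties).startsWith("multipart");
    }
}
```

Fed with the two properties strings quoted above, this classifies the embedded logo as unprocessable and the 'real' jpeg attachment as processable.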

To me this seems like a bug: SOA Suite should be able to get the available properties out of the string to be able to process this. The main problem is also that the BPEL fault handler is ignored: when the error occurs, the process fails! So I can't catch the exception and act on it. And honestly: I shouldn't have to query the table myself using the DB adapter, should I?

By the way: I'm working with 12c (12.1.3), but I assume the same will occur in 11g.

Wednesday 3 December 2014

Integrating BPM12c with BAM

In BPM12c there is a tight integration with BAM, like there was with 11g. However, BAM is not automatically installed with BPM; you need to do that separately. And having done that, you need to get BPM acquainted with BAM and instruct it to enable process analytics.

For the greater part the configuration is similar to the 11g story. My former Oracle colleague wrote a blog about it: 'Configuration of BAM and BPM for process analytics'.
There is a little difference, because in 12c you won't find the mentioned property 'DisableActions' under the '' -> 'BPMNConfig'. Instead you have to enable process analytics on the BPM server(s). The 12c docs tell you how: '11.1 Understanding Oracle BAM and BPM Integration'.
Taken from that document here a short step-list:
  1. Log in to the Fusion Middleware Control console (http://Adminserver-host:port/em). 
  2. In the Target Navigation pane, expand the Weblogic Domain node. 
  3. Select the domain in which the Oracle BAM server is installed. 
  4. Open the MBean Browser by right-clicking on the domain and select System MBean Browser. 
  5. Expand the Application Defined MBeans node. 
  6. Navigate to node -> 'Server: server_name' -> AnalyticsConfig -> analytics.
  7. Disable the 'DisableProcessMetrics' property by setting its value to false. 
  8. You might want to do the same with the 'DisableMonitorExpress'-property.
  9. Click Apply.

Tuesday 25 November 2014

Using DB Adapter to connect to DB2 on AS400

In my current project I need to connect to a DB2 database on an AS400. Doing so is no rocket science, but not exactly an NNF (Next-Next-Finish) config.

First you need to download the IBM JDBC driver for DB2, the JT400.jar, which is open source. Place it in a folder on your server. Since it's not an Oracle driver, I don't like to have it placed in the Oracle Home, so I would put it in a different lib folder, where it is recognisable. Create a logical one, where you place other shared libs as well.

There are several methods to add the lib to your WebLogic classpath. What worked for me was to add it to the 'setDomainEnv.cmd'/'' file in the domain home.

(The Default Domain of the integrated weblogic of JDeveloper 12.1.3 under Windows can be found in: “c:\Users\%USER%\AppData\Roaming\JDeveloper\system12.\DefaultDomain”)
Search for the keyword ‘POST_CLASSPATH’ and add the following at the end of the list of POST_CLASSPATH-additions:
set POST_CLASSPATH=c:\Oracle\lib\jtopen_8_3\lib\jt400.jar;%POST_CLASSPATH%
Where 'c:\Oracle\lib\jtopen_8_3' was the folder where I put it under Windows. Then restart your server(s) and create a DataSource. For 'Database Type' as well as for 'Driver', choose 'Other' in the wizard. Then for the following fields enter the corresponding values in the given format (see also the doc.):
URL: jdbc:as400://hostname/Schema-Name;translate binary=true
Driver Class:
Driver Jar:

Since in our case the database apparently has a time-out (I don't know if this is default behaviour with DB2 on AS400), I put a one-row query in the Test Table Name field. And I checked the , because I don't know the time-out frequency.
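For reference, the URL format above can be captured in a tiny Java helper; a sketch (class and method names are mine; to the best of my knowledge the driver class name is the standard one shipped in jt400.jar, and host/schema are placeholders):

```java
public class As400Jdbc {

    // The JDBC driver class shipped in jt400.jar.
    public static final String DRIVER_CLASS = "com.ibm.as400.access.AS400JDBCDriver";

    // Build the JDBC URL in the same format as entered in the
    // WebLogic DataSource wizard.
    public static String url(String host, String schema) {
        return "jdbc:as400://" + host + "/" + schema + ";translate binary=true";
    }
}
```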

A description of configuring the library and connection in JDeveloper and the DBAdapter can be found in section 9.6.2 of this doc.

Having set up the DataSource in WebLogic, you can register it in the Database Adapter. Besides providing the DataSourceName or XADataSourceName, you should adapt the PlatformClassName:

The default is '' (it only now strikes me that it contains 'org.eclipse.persistence' in the package name). Leaving it like this could have you running into the exception:
ConnectionFactory property platformClassName was set to but the database you are connecting to is DB2 UDB for AS/400
For DB2 on AS/400, the value should be: 'oracle.tip.adapter.db.toplinkext.DB2AS400Platform', see the docs here.

Monday 24 November 2014

Reminder to myself: turn off felix service urlhandlers in combined BPM & OSB12c installation

Last week I started creating a few OSB services for my current project, which is in fact a BPM12c project that needs to be integrated with database services on an AS400, thus DB2. First I found that when I tried to deploy on a standalone WLS domain (created with the qs_config script), it lacked an OSB installation, whereas the integrated WebLogic default domain has one.

But when I tried to deploy a pretty simple project, I ran into the fault 'The WSDL is not semantically valid: Failed to read wsdl file from url due to -- Unknown protocol: servicebus.'

I even tried to do an import of a configuration.jar into the sbconsole, but got the same error there.

Frustration all over the place: how hard can it be, being quite an experienced OSB developer on 11g?

Luckily I wasn't the only frustrated chap in the field: Lucas Jellema already ran into it and found a solution, where he credited Daniel Dias from Middleware by Link Consulting.

Friday 31 October 2014

OSB12c: Errorhandling in REST

Yesterday I had an OSB consulting day at a customer. We looked into a REST service that was to be extended with update functionality. Since calling an update service of an EIS (Enterprise Information System) can go wrong with all sorts of errors, it is important to be able to return a fault message with the errors, in JSON format.

Now in OSB12c it's very apparent how you define possible fault messages and even how they should be formatted in JSON:

In this sample case we created a more or less simple xsd for faults (Dutch: fouten). To test with different fault messages we simply duplicated the 'fouten' element in the xsd to 'fouten2'. You can assign different HTTP status codes to the different faults.

So this configuration is pretty simple and straightforward. But it is not quite clear in the documentation how you would return a specific fault from within your error handlers in the pipeline.

Internally, OSB works not only XML-based but actually SOAP-based. So the trick in the end is to replace the body with a SOAP-fault message; the selection of the REST/JSON error message is then done based on the structure of the document in the detail section of the SOAP fault. In the screen above, you define an xsd-element for each fault message; apparently OSB validates the SOAP-fault-detail content against each XSD defined, and the XSD against which the detail content is valid determines the returned fault, with the corresponding HTTP status.

So we created a XQuery transformation as follows:
xquery version "1.0" encoding "utf-8";

(:: OracleAnnotationVersion "1.0" ::)

declare namespace ns2="";
(:: import schema at "../PS/Schemas/fouten.xsd" ::)
declare namespace ns1="";
(:: import schema at "../BS/Schemas/XMLSchema_-130439696.xsd" ::)
declare namespace soap-env="";

declare variable $input as element() (:: schema-element(ns1:ServiceErrorMessage) ::) external;

declare function local:func($input as element() (:: schema-element(ns1:ServiceErrorMessage) ::)) as element() (:: schema-element(ns2:fouten) ::) {
  <ns2:fouten>
    <ns2:ErrorMessages>{
      for $detail in $input/ns1:detail
      return <ns2:ErrorMessage>{fn:concat("ERROR: ", fn:data($detail/ns1:message))}</ns2:ErrorMessage>
    }</ns2:ErrorMessages>
  </ns2:fouten>
};

local:func($input)
Of course the actual fault detail must follow the xsd for that particular fault. We tested it, but the faultcode or faultstring does not have any effect on the selection of the REST fault or HTTP status code.
With the xquery above we got the 'fault' returned as defined in the REST definition, as shown in the screendump above.
In our example, if we changed the contents of this xquery and replaced the tag <ns2:fouten> with <ns2:fouten2>, we got the other fault (fouten2), with the corresponding HTTP status.
A detail with contents that do not correspond to any of the defined fault xsd's results in HTTP status 500: Internal Server Error. So it is important to have a proper transformation where the returned fault detail is valid against at least one of the fault xsd's.

Another conclusion is that, since the fault selection is apparently based on validating the detail contents against the registered fault-xsd-elements, you apparently can't have different faults with the same xsd. Since JSON is 'namespace-less', you can probably solve this by defining several copies of the same xsd with different namespaces, one for each fault. The chosen namespace in the xquery would then be the selector for the fault. But since the underlying element names are the same, it would not differ in the resulting JSON message. Of course in an XML result it would differ.

Wednesday 29 October 2014

BPM & SOA Application missing in JDeveloper 12c gallery

A few weeks ago I did a BPM12c Quickstart Installation under Oracle Linux 6. Everything went smoothly, as described in the install guide as well as on many blogs already.
But I found that most of those blogs did an installation under Windows, where I did it under Oracle Linux in Virtualbox.

You would think (as I did) that it shouldn't matter. However, it turned out that in JDeveloper the 'BPM Application', amongst others, was missing from the JDeveloper New Gallery. Very inconvenient. I couldn't find any hints on the big internet; my friend Google wasn't very helpful in this.

But I wouldn't write this blog if I hadn't solved it: an update did the trick.

It turns out that I lacked the 'Spring & Oracle Weblogic SCA' extension. Using the Help->Update functionality I downloaded and installed that and after restarting JDeveloper my 'New Gallery' was properly filled.

For those not so familiar with the JDeveloper update mechanism, here's a step-by-step guide:
  1. Choose Help->Update:
  2.  Leave the Update Centers checked as default and click Next:
  3. Check 'Spring & Oracle Weblogic SCA' and click Next:
  4. Click Finish:
  5. Confirm when asked for restarting JDeveloper.
Update 2014-12-01: last week I also found this document on Oracle Support on doing a cleanup of your JDeveloper installation.

Tuesday 28 October 2014

Demo User Community in BPM 12c Quickstart

When you want to do demo-ing or perform the BPM12c workshops on a BPM12c QuickStart developers installation you'll need the Oracle BPM Demo User community. You know: with Charles Dickens, John Steinbeck and friends.

How to do so, you can find on the page following this link. You'll need the demo-community scripting that can be found following this link, and then download 'workflow-001-DemoCommunitySeedApp'.

However, besides adapting the file, there are a few changes to make in the build.xml.

First, find the property declaration for 'wls.home', and change it to:
<property name="wls.home" value="${bea.home}/wlserver"/>
This is needed, since they renamed the folder for the WebLogic server in the FMW12c home. Then, after the comment 'End Default values for params', add the following:
  <!-- Import task defs for the ant-contrib functions, such as if -->
  <property name="ant-contrib.jar" value="${bea.home}/oracle_common/modules/net.sf.antcontrib_1.1.0.0_1-0b3/lib/ant-contrib.jar"/>
  <taskdef resource="net/sf/antcontrib/antlib.xml">
    <classpath>
      <pathelement location="${ant-contrib.jar}"/>
    </classpath>
  </taskdef>
This is because the script lacks a definition for the ant-contrib lib, needed amongst others for the 'if' tasks. After these changes, it worked for me.

Wednesday 24 September 2014

JMS Properties in BPM Suite

Lately I needed to transfer messages from OSB to BPM Suite. The setup was that OSB calls a webservice implemented by a Mediator component that publishes the message on a JMS queue. The webservice request contains a message header and a message payload. The payload was published as the JMS message content, but the message header elements were added as custom JMS properties to the message. How to do that is quite easy and described, for instance, in this blog.
The JMS Adapter documentation can be found here, and the particular part about the properties here.
In essence, in the Assign Values part of the Mediator configuration, custom JMS properties can be referred to as

The BPM Process listens to the queue and needed to be adapted to read the JMS-properties. I could not find info about that, but it turns out to be simple: in the ServiceCall activity you have the Service Properties link:
 This will bring up the service properties dialog:
The green-plus and pencil icons can be used to add and edit the service property assignments:

The property name is of the same structure as with the mediator: ''.
The expression must denote an automatically instantiated data object of type String. I tried to use a structured business object, but then apparently only the root element is instantiated: at run time I got errors that the assignment of the JMS property to the particular element failed. So for each JMS property I created a separate data object of type String. And that works perfectly.

Service VersionInfo in OSB

When deploying a SCA project (SOA Suite or BPM Suite) from JDeveloper, you'll be asked to provide a version number. When deploying to a test, acceptance or production environment using a script-based (ANT or Maven) deployment, the deployment framework in use may use a release number for the versions of the different composites. Besides the support for different versions of composites residing side-by-side in SOA Suite, this is convenient because this way you know which version of the composite is deployed on the particular server. Then it is certain that a deployment of a particular version has succeeded and that the particular change that you meant to deliver is or is not in use.

In OSB there's no such thing as side-by-side support of versions, and also you can't see which particular version of an OSB configuration is deployed.

Since I was asked several times by project managers to denote the versions of a particular service in different releases, I came up with a very simple solution.

First I defined an xml schema, like the following:
<?xml version="1.0" encoding="UTF-8" ?>
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
            xmlns:ver="" elementFormDefault="qualified">
  <xsd:element name="VersionInfo" type="ver:versionInfoType">
    <xsd:annotation>
      <xsd:documentation>Version info</xsd:documentation>
    </xsd:annotation>
  </xsd:element>
  <xsd:complexType name="versionInfoType">
    <xsd:sequence>
      <xsd:element name="versionNr" type="xsd:string"/>
      <xsd:element name="versionDate" type="xsd:string"/>
      <xsd:element name="author" type="xsd:string"/>
      <xsd:element name="change" type="xsd:string"/>
      <xsd:element name="svnId" type="xsd:string"/>
    </xsd:sequence>
  </xsd:complexType>
</xsd:schema>
In my setup this xsd is placed in a separate shared xsd-project.
Then in the root of each OSB Service-project I place an xquery file as follows:
xquery version "1.0" encoding "UTF-8";
<ver:VersionInfo xmlns:ver="">
  <ver:versionNr>1.0.0</ver:versionNr>
  <ver:versionDate>2014-09-17</ver:versionDate>
  <ver:author>M. van den Akker</ver:author>
  <ver:change>ChangeNr - Particular change</ver:change>
  <ver:svnId>$Id: versie.xq 1520 2014-09-17 10:25:29Z makker $</ver:svnId>
</ver:VersionInfo>
This xquery is nothing more than an xml file with an xquery prolog; a trick you can use to externalise hardcoded properties from your proxy services. So it doesn't have input and output variable declarations. You could replace certain fields, like versionNr, with ant properties to have them replaced during an ant deployment. In my current setup I change it by hand when committing the changes to Subversion.

In my proxy services, at the 'entry points' of the services (the start of the first stage in the message flow), I add an Assign and an Alert. In the Assign the xquery is executed into a versionInfo variable:
Assign with VersionInfo xquery
Then an alert is added to have the version info shown after the call of the service. This way you can see which version of the service handled the request:
Alert of version information
The expression of the alert is:
concat('Version: ',$versionInfo/ver:versionNr,', date: ',$versionInfo/ver:versionDate)
This is where the xsd comes along: to help you enter this expression, but especially to get it valid (to get the namespace added), you must add the versionInfo element of the xsd as the type of the versionInfo variable. You could also add the namespace manually, but adding the variable structure is more convenient.

This alert is actually optional, but can be useful when debugging and investigating services and service executions. After deployment, the xquery resides on the OSB server and can be reviewed using the Project Explorer in the Service Bus console. It can be hard for the system administrator to conclude from script output whether a deployment succeeded. Comparing the versionInfo.xq on the OSB with the version in a release document that you deliver with your deployment can help him out. OSB commits the deployment as a whole, so the correct version number in the versionInfo.xq file indicates a successful deployment.

Another trick you may notice in the screendump is that I use a separate alert destination for development/debug messages. I found that you can't set the log level on an alert; you need to do that on each proxy service in a compound service configuration. Because I found that very inconvenient, I added a separate alert destination for debug messages. On acceptance and/or production you can then switch off the logging of debug messages on that particular destination.

Monday 2 June 2014

Message Correlation using JMS

Last year I created a few OSB services with the asynchronous request response message exchange pattern. OSB does not support this out of the box, since OSB is in fact synchronous in nature. Although OSB supports the WS-Addressing namespaces, you need to set the WS-Addressing elements programmatically.

Since OSB is synchronous, the request and response flows in the asynchronous request/response pattern are implemented completely separately from each other. That means that in the response flow you don't know which request message was responsible for the current response. Even worse: you don't know which client did the request, or how to respond to that client in a way it can correlate to the initiating instance. Using SOA/BPM Suite as a client, you want to correlate to the requesting process instance.

There are of course several ways to solve this. I chose to use a Uniform Distributed Queue for several reasons; knowledge of JMS and performance were a few. I only need to temporarily store a message against a key. Coherence was not on my CV yet, and a database table requires a database (connection) with the query overhead, etc.

Unfortunately you can't use the OSB transports or SOA Suite JMS adapters to get/browse for a message using a correlation-id in a synchronous way: when you create a proxy service on a JMS transport, or configure a JMS adapter for reading, you get a polling construction. But it's quite easy to do in Java, so I created a Java method to get a message based on a CorrelationId.

One thing I did not know back then was that if you put a message on the queue from one OSB server node (having a JMS server), it can't be read from the other node as such: messages are stored in the local JMS-server member of the queue.

I found that you can quite easily reach the local member of a Uniform Distributed Queue on a certain JMS server in WebLogic by prefixing the JNDI name of the queue with the JMS-server name, separated by the at-sign ('@'):

    String jmsSvrQueueJndi = jmsServer + "@" + queueJndi;

The problem now is: "how do you know which JMS Servers are available servicing your queue?"
I solved that by providing a comma-separated string of JMS-server names as a parameter to the Java Callout in my proxy service. But that left me with a more or less hardcoded string of JMS-server names, which needs to be expanded when a second OSB server node with a JMS-server instance is added. I figured that I should be able to query those names from the WebLogic instance. And of course you can: that's why I'm writing this blog.
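Once you have such a list of JMS-server names, deriving the member-queue JNDI names with the '@' trick above is straightforward; a small sketch (class and method names are mine):

```java
import java.util.ArrayList;
import java.util.List;

public class MemberQueues {

    // Prefix the distributed queue's JNDI name with each JMS-server name,
    // separated by '@', to address the local member queues.
    public static List<String> memberQueueJndiNames(List<String> jmsServers, String queueJndi) {
        List<String> names = new ArrayList<String>();
        for (String jmsServer : jmsServers) {
            names.add(jmsServer + "@" + queueJndi);
        }
        return names;
    }
}
```

For servers 'OSBJMSServer1' and 'OSBJMSServer2' and a (hypothetical) queue JNDI name 'jms/ResponseQueue', this yields 'OSBJMSServer1@jms/ResponseQueue' and 'OSBJMSServer2@jms/ResponseQueue'.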

First you need to get a connection to an MBeanServer in WebLogic. You can create a remote server connection, but since my java class is running on the OSB Server within WebLogic, a JNDI-lookup is better.

I reused a class that I created in the past for doing JNDI lookups of JDBC-Connections for both inside or outside the application server. I'll add it at the end of this blog entry.

To get an MBeanServer connection, you simply get a JNDI context from the JNDI provider mentioned above and do a lookup of the JNDI name "java:comp/env/jmx/runtime":
mbServer = (MBeanServer) jndiContext.lookup("java:comp/env/jmx/runtime");
 The documentation for this can be found here.

From the MBean Server you can get a list of JMS Servers by first instantiating a jmx 'ObjectName' using:
ObjectName queryObjName = new ObjectName("com.bea:Type=JMSServer,*");
Here we're about to ask for all MBeans of type 'JMSServer'.

I named the JMS servers in my setup with the same prefix 'OSBJMSServer' and a number, like 'OSBJMSServer1', 'OSBJMSServer2'. So I want all the JMS servers whose names start with 'OSBJMSServer'. For this I need to provide a JMX query expression. Luckily this is quite simple:
QueryExp queryExp = Query.initialSubString(Query.attr("Name"), Query.value("OSBJMSServer"));
Here I create a query expression that matches on the initial substring of the "Name" attribute of the ObjectName, which should equal the query value "OSBJMSServer". I found a blog on the JMX query possibilities here.

Having this in place you only need to perform the query:
Set<ObjectName> objNames = mbServer.queryNames(queryObjName, queryExp);
This gets you a set with all the ObjectName objects that matches the query-expression.

Now since I only need the plain names of the JMS servers (not their canonical names), I can add them easily into a List:
  List<String> jmsServers = new Vector<String>();
  for (ObjectName objName : objNames) {
   String jmsServerCanonicalName = objName.getCanonicalName();
   String jmsServerName = objName.getKeyProperty("Name");
   lgr.debug(methodName, "JmsServerCanonicalName [" + jmsServers.size() + "]: " + jmsServerCanonicalName);
   lgr.debug(methodName, "JmsServerName [" + jmsServers.size() + "]: " + jmsServerName);
   jmsServers.add(jmsServerName);
  }

For the canonical name there is a getter. For the Name property you need to use the getKeyProperty() method.

WebLogicJms Class

 The complete WeblogicJms class:
package nl.darwin-it.jms;

import java.util.List;
import java.util.Set;
import java.util.Vector;

import javax.management.MBeanServer;
import javax.management.MalformedObjectNameException;
import javax.management.ObjectName;
import javax.management.Query;
import javax.management.QueryExp;

import javax.naming.Context;
import javax.naming.NamingException;

import nl.darwin-it.jndi.JndiContextProvider;
import nl.darwin-it.log.Logger;

/**
 * @author Martien van den Akker, Darwin-IT Professionals Version 1.1, 2014-06
 *         Class with methods to query JMS artifacts from WebLogic JMS
 */
public class WeblogicJms {
 public static final String MBSVR_JNDI = "java:comp/env/jmx/runtime";
 public static final String JMSSVR_OBJNAME_BASE_QRY = "com.bea:Type=JMSServer,*";
 public static final String JMSSVR_QRY_ATTR = "Name";
 public static final String JSBJMSSVR_NAME_PREFIX = "OSBJMSServer";

 private static final String className = "WeblogicJms";
 private static MBeanServer mbServer = null;
 private static Logger lgr = new Logger(className);

 /**
  * Lazily instantiate the MBeanServer connection and cache it.
  * @return
  * @throws NamingException
  */
 private static MBeanServer getMBServer() throws NamingException {
  if (mbServer == null) {
   final Context jndiContext = JndiContextProvider.getJndiContext();
   mbServer = (MBeanServer) jndiContext.lookup(MBSVR_JNDI);
  }
  return mbServer;
 }

 /**
  * Get a list of JMS Servers belonging to OSB (Oracle Service Bus)
  * @return
  * @throws NamingException
  * @throws MalformedObjectNameException
  * @throws NullPointerException
  */
 public static List<String> getOSBJMSServers() throws NamingException, MalformedObjectNameException,
   NullPointerException {
  final String methodName = "getOSBJMSServers";

  List<String> jmsServers = new Vector<String>();

  final MBeanServer mbServer = getMBServer();

  ObjectName queryObjName = new ObjectName(JMSSVR_OBJNAME_BASE_QRY);

  QueryExp queryExp = Query.initialSubString(Query.attr(JMSSVR_QRY_ATTR), Query.value(JSBJMSSVR_NAME_PREFIX));

  Set<ObjectName> objNames = mbServer.queryNames(queryObjName, queryExp);
  lgr.debug(methodName, "Found " + objNames.size() + " objects");
  for (ObjectName objName : objNames) {
   String jmsServerCanonicalName = objName.getCanonicalName();
   String jmsServerName = objName.getKeyProperty("Name");
   lgr.debug(methodName, "JmsServerCanonicalName [" + jmsServers.size() + "]: " + jmsServerCanonicalName);
   lgr.debug(methodName, "JmsServerName [" + jmsServers.size() + "]: " + jmsServerName);
   jmsServers.add(jmsServerName);
  }

  return jmsServers;
 }

 /**
  * Get the list of OSB JMS Servers in XML format, serialized to a string.
  * @return
  * @throws MalformedObjectNameException
  * @throws NullPointerException
  * @throws NamingException
  */
 public static String getOSBJMSServersXML() throws MalformedObjectNameException, NullPointerException,
   NamingException {
  final String methodName = "getOSBJMSServersXML";
  // Serialize the server names as a simple XML document (element names illustrative).
  StringBuffer jmsServersXMLBuf = new StringBuffer("<jmsServers>");
  List<String> jmsServers = getOSBJMSServers();
  for (String jmsServer : jmsServers) {
   jmsServersXMLBuf.append("<jmsServer>").append(jmsServer).append("</jmsServer>");
  }
  jmsServersXMLBuf.append("</jmsServers>");
  return jmsServersXMLBuf.toString();
 }
}



JNDI ProviderClass

The JNDI Provider Class:
package nl.darwin-it.jndi;

import javax.naming.Context;
import javax.naming.InitialContext;
import javax.naming.NamingException;

import nl.darwin-it.log.Logger;

/**
 * Class providing and initializing a JndiContext.
 * @author Martien van den Akker
 * @author Darwin IT Professionals
 */
public abstract class JndiContextProvider {
 private static final String className = "JndiContextProvider";
 private static Logger lgr = new Logger(className);
 private static Context jndiContext = null;

 /**
  * Get an Initial Context. Within a J2EE environment the InitialContext is
  * fetched from the container; outside of it, the factory class is fetched
  * from the properties file that should be on the class path.
  * @return Context
  * @throws NamingException
  */
 private static Context getInitialContext() throws NamingException {
  final String methodName = "getInitialContext";
  Context ctx = new InitialContext();
  return ctx;
 }

 /**
  * Create a Subcontext within the Initial Context.
  * @param subcontext
  * @throws NamingException
  */
 public static void createSubcontext(String subcontext)
   throws NamingException {
  final String methodName = "createSubcontext";
  lgr.debug(methodName, "Create Subcontext " + subcontext);
  Context ctx = getJndiContext();
  ctx.createSubcontext(subcontext);
 }

 /**
  * Set jndiContext.
  */
 public static void setJndiContext(Context newJndiContext) {
  jndiContext = newJndiContext;
 }

 /**
  * Get jndiContext; lazily initialized from the InitialContext.
  */
 public static Context getJndiContext() throws NamingException {
  final String methodName = "getJndiContext";
  if (jndiContext == null) {
   lgr.debug(methodName, "Get Initial Context");
   jndiContext = getInitialContext();
  }
  return jndiContext;
 }
}

Tuesday 20 May 2014

OSB Publish: exception with apparent valid XSLT

I ran into this earlier, but today it happened to me again, so apparently a blog post would be a good idea.

I use quite some XSLTs in my OSB project. Merely because I'm more used to XSLT than to XQuery, but also because I have the feeling that it is better suited to transforming big XSDs. And I found a few minor problems with some XSDs in the XQuery mapper of OEPE.

Anyway, in an XSLT there is a special attribute, "exclude-result-prefixes", that allows you to name the namespace prefixes (declared for referencing functions) that should be excluded from the resulting xml.
Today (as I did before) I added some prefixes to the attribute. Refreshing the XSLT folder in OEPE and publishing the project led me to the following exception:
Conflicts found during publish.
[MyPipelinePair, Response Pipeline, MyOperation-TransformResponse, Replace action] XQuery expression validation failed: An error occurred creating the XSLT executor: javax.xml.transform.TransformerException: java.lang.IllegalStateException: The XSLT resource "XSLT MyService/CBS/V1/xslt/MyXSLT" is not in a valid state..
javax.xml.transform.TransformerException: org.xml.sax.SAXParseException; lineNumber: 19; columnNumber: 69; Element type "xsl:variable" must be followed by either attribute specifications, ">" or "/>".
[MyPipelinePair, Response Pipeline, MyOperation-TransformResponse, Replace action] XQuery expression validation failed: An error occurred creating the XSLT executor: javax.xml.transform.TransformerException: java.lang.IllegalStateException: The XSLT resource "XSLT MyService/CBS/V1/xslt/MyXSLT" is not in a valid state..

Reading this error, what do you think was wrong with my XSLT?
At first I indeed checked my variable declarations, but I did not touch those. It appeared, however, that the publisher in OEPE has a limitation on the length of this attribute. It's one of those errors that does not say what it means at all.

I use the JDeveloper XSLT mapper, and if you create a new XSLT with it, by default you're provided with several namespace declarations for xpath extension functions for bpel, bpm/bpmn, human workflow, ldap, mediator, etc. Most of them you won't use, especially in an OSB project, since (in 11g, that is) those functions are not supported by OSB. So you can remove all the unnecessary namespace declarations from the header, making them obsolete for the "exclude-result-prefixes" attribute. This not only makes the XSLT header shorter, but also allows you to clean up the "exclude-result-prefixes" attribute itself.

When the "exclude-result-prefixes" attribute is shortened enough, the XSLT is publishable again.
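As an illustration (the ns1 namespace and its URL are made up), a cleaned-up stylesheet header could then look like this, with only the prefixes that are actually used left over:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Only the prefixes that are actually used remain, so the
     exclude-result-prefixes attribute stays short enough for the OEPE publisher. -->
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
                xmlns:xsd="http://www.w3.org/2001/XMLSchema"
                xmlns:ns1="http://www.example.org/myservice/v1"
                exclude-result-prefixes="xsl xsd ns1">
  ...
</xsl:stylesheet>
```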

Wednesday 7 May 2014

OSB: SplitJoin

Yesterday I had a mail exchange with a colleague. He was giving an ADF course, where students complained that in the market little was known about the techniques they had to work with. Especially that there's nobody who knows everything. The actual question was about security and SAML. This post is not about that, see my previous one. But I figured that although I did quite a lot with SOA/BPM Suite, OSB and other FMW tools, I do not know everything about the Oracle FMW SOA tools.

Today I took a look into the SplitJoin component of OSB. The last days I created a compound service in an OSB Proxy Service, where the different sub-components could be executed in parallel. Having a closer look into the SplitJoin component, it actually sort of amazed me. Although you can do a sort of process modelling in a proxy service, including if-then-else, loops, etc., you can't do parallelism. The name of the SplitJoin already suggests that it is intended to solve just that. But the name of the component hardly suggests what it actually is. To create a SplitJoin, you'll have to create a new component:
Then OEPE asks for a binding/operation in a wsdl in your project. The resulting component has an extension that suggests the specifics of the component a little better: '.flow'.

For this post I created a more or less dummy SplitJoin, and added a scope element with a fault-handler:

An experienced SOASuite developer might recognize the structure. For me another hint was the Design Palette:
The SplitJoin is actually a BPEL process! So not only can you do parallel flows, but in the palette you can see that there are several flow-control constructs available for the process:
  • For Each
  • If
  • Parallel
  • Raise/Reraise Error
  • Repeat Until
  • Scope
  • While
  • Wait
I'm not going to give a small BPEL course here. If you google 'OSB SplitJoin Examples', you'll find several examples of SplitJoin constructions.

I will make several remarks here.
Firstly, given the names and constructions, apparently the supported BPEL version is BPEL 2.0. Secondly, you would think that you could interchange BPEL processes with SOASuite BPEL. However, if you implement several Assign operations, you'll see that the different operations on the messages/variables are done with BEA extensions. The Log operation is also a BEA extension to the BPEL Assign construct.

Furthermore, I found that you can't introspect the xml structure of the messages in the expression builders of the assign operations, like I'm used to in the JDeveloper BPEL Designer, or even in the OEPE ProxyService 'MessageFlow' modeller. This is very inconvenient, because you'll have to edit the xpath expressions by hand. And I found by trial and error that you can't refer to the root element of the variable. If you log a variable, you'll see the root element. But apparently you have to refer to the sub-elements relative to the variable, like: '$ServiceReply.part1//ns1:subElement'.

Although you can model a complete BPEL process in the SplitJoin, I would be reluctant to use it extensively in OSB. For a complex BPEL process I would use Oracle SOASuite. Not only because the JDeveloper BPEL Designer is more convenient; the support of the runtime engine in Enterprise Manager is much better. In OSB you can't introspect a running or completed instance as you can in EM.

Lastly, you can't call a SplitJoin/BPEL flow directly from your ProxyService. You'll need to generate a BusinessService on it. This is actually quite simple: just right-click on the SplitJoin -> Oracle Service Bus -> Generate Business Service.

In my case I created a simple wsdl especially for the SplitJoin solution. It is called from my proxy service only for the parallelism part. The request and response messages are based on a separate xsd with the different request and response elements combined. In the ProxyService I already created functionality to aggregate and transform the response parts into a combined response.
If you open the source of the SplitJoin BPEL flow, you'll get (abbreviated, with most attributes elided):
<?xml version="1.0" encoding="UTF-8"?>
<bpel:process name="SplitJoin" ...>
        <bpel:partnerLinks>
                <bpel:partnerLink name="SplitJoin" partnerLinkType="tns:SplitJoin" .../>
        </bpel:partnerLinks>
        <bpel:variables>
                <bpel:variable name="request" .../>
                <bpel:variable name="response" .../>
        </bpel:variables>
        <bpel:sequence>
                <bpel:receive partnerLink="SplitJoin" operation="serviceOperation" variable="request" createInstance="yes">
                        <rescon:wsdl ref="CDM/wsdl/V1/CBS/service" binding="bind:Service_pttSOAP11Binding"/>
                </bpel:receive>
                ...
                <bpel:reply partnerLink="SplitJoin" operation="serviceOperation" variable="response"/>
        </bpel:sequence>
</bpel:process>

The SplitJoin is a nice component. But it's unclear to me why a separate engine was added just to be able to do parallelism. And the designer could be improved. But better still: leave it to the SOASuite BPEL designer and engine. Personally, I'm not sure to what extent I should use this.

Friday 11 April 2014

Service Provider initiated SSO on WLS11g using SAML2.0


At a recent customer I got the assignment to implement a SAML 2.0 configuration.

The customer in this setup is a Service Provider. They provide a student-administration application for the Dutch Higher Education sector: colleges and universities. The application is conventionally implemented on-premise, but they would like to move to a SaaS model. One institute is going to use the application from 'the cloud'. In the Dutch education sector, an organization called SurfConext serves as an authentication broker.

A good schematic explanation of the setup is in the Weblogic 11g docs:

When a user connects to the application, Weblogic finds that the user is not authenticated: it lacks a SAML2.0 token (2). So when configured correctly the browser is rerouted to SurfConext (3). On an authentication request SurfConext displays a so-called ‘Where Are You From’ (WAYF) page, on which a user can choose the institute to which he or she is connected. SurfConext then provides a means to enter the username and password (4). On submit SurfConext validates the credentials against the actual IdP, which is provided by the user’s institute (5). On a valid authentication, SurfConext provides a SAML2.0 token identifying the user with possible assertions (6). The page is refreshed and redirected to the landing page of the application (7).

For Weblogic, SurfConext is in fact the Identity Provider, although based on the choice on the WAYF page, it reroutes the authentication request to the IdP of the particular institute.

Unfortunately I did not find a how-to for that particular setup in the docs, although I found this. But I did find the following blog, which helped me a lot. Basically the setup is only the service-provider part of that description.

So let me walk you through it. This is a larger blog; in fact I copy&pasted larger parts from the configuration document I wrote for the customer.

Configure Service provider


To be able to test the setup against a test IdP of SurfConext, the configured Weblogic needs to be reachable from the internet. Appropriate firewall and proxy-server configuration needs to be done upfront, to enable both SurfConext and a remote user to connect to the Weblogic Server.

All configuration regarding URLs needs to be done using the outside URLs configured above.

A PC with a direct internet connection, that is able to connect through these same URLs, is needed to test the configuration. Connecting a pc to the intranet of the customer enables it to connect to the internet, but in my case the internal network configuration prevented connecting to the weblogic server using the remote URLs.

During the configuration a so-called SAML Metadata file is created. This file is requested by SurfConext to get acquainted with the Service Provider. Since this configuration can change through reconfigurations, SurfConext requests it through an HTTPS url. This url needs to be configured, and must also be remotely connectable. An option is the htdocs folder of a webserver that is connectable through https. In other SAML2 setups you might need to upload the metadata file to the identity provider's server.

You also need the SAML metadata of SurfConext. It can be downloaded from:

Update Application

The application needs to be updated and redeployed to use the weblogic authenticators instead of the native logon form. To do so, the web.xml needs to be updated. In the web.xml (in the WEB-INF of the application war file) look for the following part:
And replace it with:
Repackage and redeploy the application to weblogic.
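As an illustration (assuming the application originally used its own FORM-based login; the realm name is an example), the change typically boils down to something like:

```xml
<!-- Before: the application's own logon form -->
<login-config>
  <auth-method>FORM</auth-method>
  ...
</login-config>

<!-- After: let the container authenticate via the WLS security realm,
     so the SAML2 identity asserter can do its work -->
<login-config>
  <auth-method>CLIENT-CERT</auth-method>
  <realm-name>myrealm</realm-name>
</login-config>
```

With CLIENT-CERT, Weblogic performs identity assertion through the configured security providers, which is what the SAML2 setup below relies on.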

Add a new SAML2IdentityAsserter

Here we start with the first steps to configure Weblogic: create a SAML2IdentityAsserter on the Service Provider domain.
  1. Login to ServiceProvider domain - Weblogic console
  2. Navigate to “Security Realms”:
  3.  Click on ”myrealm” 
  4. Go to the tab  ”Providers–>Authentication” :
  5. Add a new “SAML2IdentityAsserter”
  6. Name it for example: “SurfConextIdentityAsserter”:
  7. Click Ok, Save and activate changes if you're in a production domain (I'm not going to repeat that every time again in the rest of this blog). 
  8. Bounce the domain (All WLServers including AdminServer)

Configure managed server to use SAML2 Service Provider 

In this part the managed server(s) serving the application need to be configured for the so-called 'federated services'. They need to know how to behave as a SAML2.0 Service Provider.
So perform the following steps:
  1.  Navigate to the managed server, and select the “Federation Services–>SAML 2.0 Service Provider” sub tab:

  2. Edit the following settings:
  3. Field / Value:
    Preferred Binding: POST
    Default URL: http://hostname:portname/application-URI
    (This URL should be accessible from outside the organization, that is from SurfConext.)
  4. Click Save.
  5. Navigate to the managed server, and select the “Federation Services–>SAML 2.0 General” sub tab:
  6. Edit the following settings:
  7. Field / Value:
    Replicated Cache Enabled: check or uncheck as needed
    Contact Person Given Name: e.g. Jean-Michel
    Contact Person Surname: e.g. Jarre
    Contact Person Type: choose one from the list, like 'technical'
    Contact Person Company: e.g. Darwin-IT Professionals
    Contact Person Telephone Number: e.g. 555-12345
    Contact Person Email:
    Organization Name: e.g. Hatseflats B.V.
    Published Site URL:
    (This URL should be accessible from outside the organization, that is from SurfConext. The Identity Provider needs to be able to connect to it.)
    Entity ID: e.g.
    (SurfConext expects a URI with at least a colon (':'), usually the URL of the SP.)
    Recipient Check Enabled: uncheck
    (When checked, Weblogic will check the responding URL against the URL in the original request. This could result in a '403 Forbidden' message.)
    Single Sign-on Signing Key Alias: demoidentity
    (If signing is used, the alias of the proper private certificate in the keystore that is configured in WLS is to be provided.)
    Single Sign-on Signing Key Pass Phrase: DemoIdentityPassPhrase
    Confirm Single Sign-on Signing Key Pass Phrase: DemoIdentityPassPhrase
  8. Save the changes and export the SP metadata into an XML file: 
    1. Restart the server
    2. Click on 'Publish Meta Data'
    3. Restart the server
    4. Click on 'Publish Meta Data'
    5. Provide a valid path, like /home/oracle/Documents/... and click 'OK'.
    6. Copy this file to a location on an http-server that is remotely connectable through HTTPS and provide the url to SurfConext.

Configure Identity Provider metadata on SAML Service Provider in Managed Server

Add new “Web Single Sign-On Identity Provider Partner” named for instance "SAML_SSO_SurfConext".
  1. In Admin Console navigate to the myrealm Security Realm and select the “Providers–>Authentication
  2. Select the SurfConextIdentityAsserter SAML2_IdentityAsserter and navigate to the “Management” tab:
  3. Add a new “Web Single Sign-On Identity Provider Partner
    1. Name it: SAML_SSO_SurfConext
    2. Select “SurfConext-metadata.xml”
    3. Click 'OK'.
  4. Edit the created SSO Identity Provider Partner “SAML_SSO_SurfConext” and provide the following settings:
  5. Field / Value:
    Description: SAML Single Sign On partner SurfConext
    Redirect URIs: /YourApplication-URI
    (These are URIs relative to the root of the server.)

Add SAMLAuthenticationProvider

In this section an Authentication provider is added.
  1. Navigate to the ‘Providers->Authentication’ sub tab of the ‘myrealm’ Security Realm:
  2. Add a new Authentication Provider. Name it: ‘SurfConextAuthenticator’ and select as type: 'SAMLAuthenticator'.
    Click on the new Authenticator and set the Control Flag to ‘SUFFICIENT’:
  3. Return to the authentication providers and click on 'Reorder'.
    Use the selection boxes and the arrow buttons to reorder the providers as follows:
    The SurfConext authenticator and Identity Asserter should be first in the sequence.

Set all other authentication providers to sufficient

The control flag of the Default Authenticator is by default set to ‘REQUIRED’. That means that it must be executed for every authentication request. However, for the application we want the SAMLAuthenticator to be sufficient, so that the other authenticators need not be executed. So set these other ones (if others besides the DefaultAuthenticator exist) to ‘SUFFICIENT’ as well.

Enable debug on SAML

To enable debug messages on SAML, navigate to the 'Debug' tab of the Managed Server:
Expand the nodes ‘weblogic -> security’. Check the node ‘Saml2’ and click 'Enable'. This will add SAML2 related logging during authentication processes to the server.log. To disable the logging, check the node or higher level nodes and click 'Disable'.

Deploy the Identity Name Mapper

SurfConext generates a userid for each connected user. SurfConext provides two options for this: a persistent userid throughout all sessions, or a userid per session. Either way, the userid is generated as a GUID that is not registered within the customer's application, and on its own it does not relate to known users in the application. In the SAML token, however, the username is provided as well. To map this to the actual userid that Weblogic provides to the application, an IdentityMapper class is needed. The class implements a certain interface of weblogic, and uses a custom principal class that implements a weblogic interface as well. The implementation is pretty straightforward. I found an example that uses an extra bean for a Custom Principal. The IdentityMapper class is as follows:
package nl.darwin-it.saml-example;

import java.security.Principal;

import java.util.ArrayList;
import java.util.Collection;
import java.util.logging.Logger;

import weblogic.logging.LoggingHelper;
import weblogic.security.service.ContextHandler;

import com.bea.security.saml2.providers.SAML2AttributeInfo;
import com.bea.security.saml2.providers.SAML2AttributeStatementInfo;
import com.bea.security.saml2.providers.SAML2IdentityAsserterAttributeMapper;
import com.bea.security.saml2.providers.SAML2IdentityAsserterNameMapper;
import com.bea.security.saml2.providers.SAML2NameMapperInfo;

public class SurfConextSaml2IdentityMapper implements SAML2IdentityAsserterNameMapper,
                                                      SAML2IdentityAsserterAttributeMapper {
  public static final String ATTR_PRINCIPALS = "com.bea.contextelement.saml.AttributePrincipals";
  public static final String ATTR_USERNAME = "urn:mace:dir:attribute-def:uid";

  private Logger lgr = LoggingHelper.getServerLogger();
  private final String className = "SurfConextSaml2IdentityMapper";

  public String mapNameInfo(SAML2NameMapperInfo saml2NameMapperInfo,
                            ContextHandler contextHandler) {
    final String methodName = className + ".mapNameInfo";
    String user = null;

    debug(methodName, "saml2NameMapperInfo: " + saml2NameMapperInfo.toString());
    debug(methodName, "contextHandler: " + contextHandler.toString());
    debug(methodName, "contextHandler number of elements: " + contextHandler.size());

    // getNames gets a list of ContextElement names that can be requested.
    String[] names = contextHandler.getNames();

    // For each possible element
    for (String element : names) {
      debug(methodName, "ContextHandler element: " + element);
      // If one of those possible elements has the AttributePrincipals
      if (element.equals(ATTR_PRINCIPALS)) {
        // Put the AttributePrincipals into an ArrayList of CustomPrincipals
        ArrayList<CustomPrincipal> customPrincipals =
          (ArrayList<CustomPrincipal>) contextHandler.getValue(ATTR_PRINCIPALS);
        int i = 0;
        String attr;
        if (customPrincipals != null) {
          // For each AttributePrincipal in the ArrayList
          for (CustomPrincipal customPrincipal : customPrincipals) {
            // Get the Attribute Name and the Attribute Value
            attr = customPrincipal.toString();
            debug(methodName, "Attribute " + i + " Name: " + attr);
            debug(methodName,
                  "Attribute " + i + " Value: " + customPrincipal.getCollectionAsString());
            // If the Attribute is the uid attribute
            if (attr.equals(ATTR_USERNAME)) {
              user = customPrincipal.getCollectionAsString();
              // Remove the "@DNS.DOMAIN.COM" (case insensitive) and set the username to that string
              if (!user.equals("null")) {
                user = user.replaceAll("(?i)\\@CLIENT\\.COMPANY\\.COM", "");
                debug(methodName, "Username (from loginAccount): " + user);
              }
            }
            i++;
          }
        }

        // For some reason the ArrayList of CustomPrincipals was blank - just set the username to the Subject
        if (user == null || "".equals(user)) {
          user = saml2NameMapperInfo.getName(); // Subject = BRID
          debug(methodName, "Username (from Subject): " + user);
        }

        return user;
      }
    }

    // Just in case AttributePrincipals does not exist
    user = saml2NameMapperInfo.getName(); // Subject = BRID
    debug(methodName, "Username (from Subject): " + user);

    // Set the username to the Subject
    return user;
  }

  /*  public Collection<Object> mapAttributeInfo0(Collection<SAML2AttributeStatementInfo> attrStmtInfos, ContextHandler contextHandler) {
      final String methodName = className+".mapAttributeInfo0";
      if (attrStmtInfos == null || attrStmtInfos.size() == 0) {
          debug(methodName,"CustomIAAttributeMapperImpl: attrStmtInfos has no elements");
          return null;
      }

      Collection<Object> customAttrs = new ArrayList<Object>();

      for (SAML2AttributeStatementInfo stmtInfo : attrStmtInfos) {
          Collection<SAML2AttributeInfo> attrs = stmtInfo.getAttributeInfo();
          if (attrs == null || attrs.size() == 0) {
              debug(methodName,"CustomIAAttributeMapperImpl: no attribute in statement: " + stmtInfo.toString());
          } else {
              for (SAML2AttributeInfo attr : attrs) {
                  if (attr.getAttributeName().equals("AttributeWithSingleValue")) {
                      CustomPrincipal customAttr1 = new CustomPrincipal(attr.getAttributeName(), attr.getAttributeValues());
                      customAttrs.add(customAttr1);
                  } else {
                      String customAttr = new StringBuffer().append(attr.getAttributeName()).append(",").append(attr.getAttributeValues()).toString();
                      customAttrs.add(customAttr);
                  }
              }
          }
      }
      return customAttrs;
  }  */

  public Collection<Principal> mapAttributeInfo(Collection<SAML2AttributeStatementInfo> attrStmtInfos,
                                                ContextHandler contextHandler) {
    final String methodName = className + ".mapAttributeInfo";
    Collection<Principal> principals = null;
    if (attrStmtInfos == null || attrStmtInfos.size() == 0) {
      debug(methodName, "AttrStmtInfos has no elements");
    } else {
      principals = new ArrayList<Principal>();
      for (SAML2AttributeStatementInfo stmtInfo : attrStmtInfos) {
        Collection<SAML2AttributeInfo> attrs = stmtInfo.getAttributeInfo();
        if (attrs == null || attrs.size() == 0) {
          debug(methodName, "No attribute in statement: " + stmtInfo.toString());
        } else {
          for (SAML2AttributeInfo attr : attrs) {
            CustomPrincipal principal =
              new CustomPrincipal(attr.getAttributeName(),
                                  attr.getAttributeValues());
            debug(methodName, "Add principal: " + principal.toString());
            principals.add(principal);
          }
        }
      }
    }
    return principals;
  }

  private void debug(String methodName, String msg) {
    lgr.fine(methodName + ": " + msg);
  }

  private void debugStart(String methodName) {
    debug(methodName, "Start");
  }

  private void debugEnd(String methodName) {
    debug(methodName, "End");
  }
}

The commented method ‘public Collection<Object> mapAttributeInfo0’ is left in the source as an example method. The CustomPrincipal bean:
package nl.darwin-it.saml-example;

import java.util.Collection;
import java.util.Iterator;

import weblogic.security.principal.WLSAbstractPrincipal;
import weblogic.security.spi.WLSUser;

public class CustomPrincipal extends WLSAbstractPrincipal implements WLSUser {
  private String commonName;
  private Collection collection;

  public CustomPrincipal() {
    super();
  }

  public CustomPrincipal(String commonName) {
    super();
    setCommonName(commonName);
  }

  public CustomPrincipal(String name, Collection collection) {
    super();
    setCommonName(name);
    setCollection(collection);
  }

  public void setCommonName(String commonName) {
    // Feed the mandatory principal name
    this.commonName = commonName;
    this.setName(commonName);
    System.out.println("Attribute: " + this.getName());
    // System.out.println("Custom Principal commonName is " + this.commonName);
  }

  public Collection getCollection() {
    return collection;
  }

  public String getCollectionAsString() {
    String collasstr;
    if (collection != null && collection.size() > 0) {
      for (Iterator iterator = collection.iterator(); iterator.hasNext(); ) {
        collasstr = (String) iterator.next();
        // Return the first value in the collection
        return collasstr;
      }
    }
    return "null";
  }

  public void setCollection(Collection collection) {
    this.collection = collection;
    // System.out.println("set collection in CustomPrincipal!");
    if (collection != null && collection.size() > 0) {
      for (Iterator iterator = collection.iterator(); iterator.hasNext(); ) {
        final String value = (String) iterator.next();
        System.out.println("Attribute Value: " + value);
      }
    }
  }

  public int hashCode() {
    final int prime = 31;
    int result = super.hashCode();
    result = prime * result + ((collection == null) ? 0 : collection.hashCode());
    result = prime * result + ((commonName == null) ? 0 : commonName.hashCode());
    return result;
  }

  public boolean equals(Object obj) {
    if (this == obj)
      return true;
    if (!super.equals(obj))
      return false;
    if (getClass() != obj.getClass())
      return false;
    CustomPrincipal other = (CustomPrincipal) obj;
    if (collection == null) {
      if (other.collection != null)
        return false;
    } else if (!collection.equals(other.collection))
      return false;
    if (commonName == null) {
      if (other.commonName != null)
        return false;
    } else if (!commonName.equals(other.commonName))
      return false;
    return true;
  }
}
Package the classes as a java archive (jar) and place it in a folder on the weblogic server, for instance $DOMAIN_HOME/lib. Although $DOMAIN_HOME/lib is on the classpath for many purposes, for this purpose the jar file is not picked up by the class loaders, probably due to the class-loader hierarchy. To have the jar file (SurfConextSamlIdentityMapper.jar) in the system class path, add the complete path to the jar file to the classpath on the Startup tab of both the AdminServer and the Managed Server. The AdminServer is needed here as well, since the class is configured through the Realm, and during the configuration the existence of the class is checked. Apparently it is also required to add the weblogic.jar before the SurfConextSamlIdentityMapper.jar in the startup classpath. Then restart the AdminServer as well as the managed servers.

Configure the Identity Name Mapper

Now the Identity Name mapper class can be configured:
  1. In Admin Console navigate to the myrealm Security Realm and select the “Providers–>Authentication
  2. Select the SurfConextIdentityAsserter SAML2_IdentityAsserter and navigate to the “Management” tab:
  3. Edit the created SSO Identity Provider Partner “SAML_SSO_SurfConext”.

    Provide the following settings:
    Identity Provider Name Mapper Class Name: nl.darwin-it.saml-example.SurfConextSaml2IdentityMapper

Test the application

At this point the application can be tested. Browse, using the externally connected PC, to the application using the remote URL. If all is well, the browser is redirected to SurfConext’s Where Are You From page. Choose the following provider:

Connect as ‘student1’ with password ‘student1’ (or one of the other test credentials, like student2, student3). After a successful logon, the browser should be redirected to the application. The chosen credential should of course be known as a userid in the application.


This is one of the bigger stories on this blog. I actually edited the configuration document as a blog entry. I hope you'll find it useful. With this blog you have a complete how-to for the Service Provider part of a Service Provider initiated SSO setup.

SAML2 seemed complicated to me at first. And under the covers it still might be. But it turns out that Weblogic 11g has a great implementation for it, which is neatly configurable. It's a little pity that you need a mapper class for the identity mapping; it would be nice if you could configure which attribute value is to be returned as the userid. But the mapper class is not that complicated.

Thursday 10 April 2014

JDeveloper XSL Mapper tip

Of course you know already that in JDeveloper you can create xsl maps just by drawing lines between source and target elements. In many cases you need functions or complex expressions in between. Those are "drag-and-droppable" as well. I found that you can even drop a function on a line, and the function will be added to the expression. So with a little thought about the sequence of "drag-and-drops" of functions, you can assemble pretty complex expressions just by using the mouse.

Although I'm not afraid to hack in the source code of the xsl for quickness, I found that this allowed me to spare a few switches between the Design and the Source tab. That is convenient, since hacking the source and switching back to the Design tab causes the Designer to initialize again, driving you to expand all the nodes you were working on again. Understandable, but inconvenient with large XSDs.

What I did not know until recently is how to set a variable to an element. So what I did before was to hack in the source a piece of code like:
  <xsl:value-of select="$landCodeNL" />
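That presumes a matching variable declaration elsewhere in the stylesheet; for illustration (the value 'NL' is made up):

```xml
<!-- Declared once, e.g. as a top-level variable in the stylesheet -->
<xsl:variable name="landCodeNL" select="'NL'"/>
```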

It turns out that you can do that by "drag-and-drop" as well. In the component palette you need to select the "Advanced" functions:
At the bottom you find an xpath-expression element. Drag-and-drop it into the design area and connect it to the target element.

When you edit it, you can just type in your expression, for instance just a variable. When you start with a dollar sign, it even gives you a drop-down list with the available variables. Just pick the right one and you're done.

I admit, no high-standard tip, but convenient enough though, for me at least.

Wednesday 9 April 2014

SQLServer: date conversions

In my current project I need to query an MS SqlServer database.
Unfortunately the dates are stored as a BigInt instead of a proper date datatype.
So I had to find out how to compare the dates with the system date, and how to get the system date. To log this for possible later use: as an exception, a blog about SqlServer.

To get the system date, you can do:
  SELECT GETDATE()
It's maybe my Oracle background, but I would rather write this as:
  SELECT CURRENT_TIMESTAMP
I found this at this blog. Contrary to the writer of that blog I prefer this last version, since I found that it works on Oracle too. There are several ways to convert this to a bigint, but the most compact I found is:
  ( SELECT YEAR(dt)*10000 + MONTH(dt)*100 + DAY(dt) sysdateInt
    FROM
    -- Test Data
    (SELECT GETDATE() dt) a ) utl
The way I wrote this makes it usable as a subquery or in a joined query:
SELECT
  Ent.* ,
  CASE
    WHEN Ent.endDate IS NOT NULL
    AND Ent.endDate-1 < sysdateInt
    THEN Ent.endDate-1
    ELSE sysdateInt
  END refEndDateEntity
FROM
  SomeEntity Ent,
  ( SELECT YEAR(dt)*10000 + MONTH(dt)*100 + DAY(dt) sysdateInt
    FROM
    -- Test Data
    (SELECT GETDATE() dt) a ) utl;
To convert a bigint to a date, you can do the following (conversion style 112 is the yyyymmdd format):
  CONVERT(date, CONVERT(varchar(8), Ent.endDate), 112)
However, I found that although this works in a select clause, in the where clause this would run into a "Data Truncation" error. Maybe it is due to the use of SqlDeveloper, and thus a JDBC connection to SqlServer, but I'm not so enthusiastic about the error responses of SqlServer... I assume the error has to do with the fact that SqlServer has to interpret a column value of a row it did not already select, that is, when evaluating whether to add the row (or not) to the result set. So to make it work I added the construction as a determination value in the select clause of a 1:1 view on the table, and used that view instead of the table. Then the selected value can be used in the where clause.
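A sketch of that workaround, with made-up table and column names (the endDate column holds a bigint in yyyymmdd form, and style 112 converts the resulting string to a date):

```sql
-- Hypothetical 1:1 view exposing the bigint yyyymmdd column as a proper date
CREATE VIEW SomeEntityVw AS
SELECT Ent.*,
       CONVERT(date, CONVERT(varchar(8), Ent.endDate), 112) AS endDateDt
FROM SomeEntity Ent;

-- The converted value can now safely be used in a where clause:
SELECT *
FROM SomeEntityVw
WHERE endDateDt < GETDATE();
```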