Friday, 9 February 2018

Weblogic 12c + SAML2: publish your metadata over a URL

This week I got to do a SAML2 implementation for APEX against ADFS again. Actually the same setup as last year. One pitfall I fell into with open eyes was the Redirect URI on the 'Web SSO Partner Provider'. I entered /ords/f*, but it had to be without the wildcard: /ords/f. But that aside.

One step in the setup of a SAML2 configuration is that you have to publish the metadata, by clicking a button. Some SAML2-capable middleware solutions can publish the metadata over a URL. ADFS does support fetching the metadata of the Service Provider (in this case Weblogic 12c serving your application) from a URL. This saves you from having to hand over the xml file every time you change or update your configuration, for instance because of expired certificates. How nice would it be if Weblogic supported this?

Well, actually, it does! Sort of... Weblogic can serve a document folder, like the htdocs folder of Apache. To do so, you need to create a war file with only a weblogic.xml file that couples a context root to a certain folder. And apparently Glassfish can do so too!

When you install ORDS on Weblogic following the steps, you generate an i.war that is actually the example for this post. You could extract that file and adapt it for this purpose. But I wanted to be able to generate it, so I could reuse it for several other purposes if needed.

So I started with a new Saml2MetaData project folder and created a src folder, with a WEB-INF folder beneath it.
Then I copied the three deployment descriptors:
  • sun-web.xml
  • web.xml
  • weblogic.xml
 The sun-web.xml (not being the travel company):
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE sun-web-app PUBLIC "-//Sun Microsystems, Inc.//DTD GlassFish Application Server 3.0 Servlet 3.0//EN" "http://www.sun.com/software/appserver/dtds/sun-web-app_3_0-0.dtd">
<sun-web-app>
 <!-- This element specifies the context path the static resources are served from --> 
 <context-root>${samlMetaData.contextRoot}</context-root>
 <!-- This element specifies the location on disk where the static resources are located -->
 <property name="alternatedocroot_1" value="from=/* dir=${samlMetaData.home}"/>
</sun-web-app>

As you can see, I placed the ${samlMetaData.contextRoot} property in the context-root tag, and the property named alternatedocroot_1 got the directory reference containing ${samlMetaData.home}.

The web.xml is there for completeness, but does not contain a directory reference:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE web-app PUBLIC
"-//Sun Microsystems, Inc.//DTD Web Application 2.3//EN"
"http://java.sun.com/j2ee/dtds/web-app_2_3.dtd">
<web-app>
 <!-- This Web-App leverages the alternate doc-root functionality in WebLogic and GlassFish to serve static content 
      For WebLogic refer to the weblogic.xml file in this folder
      For GlassFish refer to the sun-web.xml file in this folder
  -->
</web-app>

And then the weblogic.xml, containing the same properties referencing the context root and folder:
<weblogic-web-app xmlns="http://www.bea.com/ns/weblogic/weblogic-web-app">
 <!-- This element specifies the context path the static resources are served from -->
 <context-root>${samlMetaData.contextRoot}</context-root>
 <virtual-directory-mapping>
  <!-- This element specifies the location on disk where the static resources are located -->
  <local-path>${samlMetaData.home}</local-path>
  <url-pattern>/*</url-pattern>
 </virtual-directory-mapping>
</weblogic-web-app>

Then I need an ANT build file that copies these files while replacing the properties. I would have done it with WLST if I had quickly found a way to wrap the lot into a war file, but ANT does the job well. First I need a build.properties file that provides the property values:
build.dir=${basedir}/build
dist.dir=${basedir}/dist
src.dir=${basedir}/src
samlMetaData.home=c:\\certs\\saml2
samlMetaData.contextRoot=/samlMetaData


And then the ANT build.xml file:
<?xml version="1.0" encoding="UTF-8"?>
<project name="SamlMetaData" basedir="." default="build">
   <property file="build.properties" />
   <!-- Clean & Init -->
   <target name="clean">
      <echo>Delete build and dist folder</echo>
      <delete dir="${build.dir}" />
      <delete dir="${dist.dir}" />
   </target>
   <target name="init" depends="clean">
      <echo>Create build and dist folder</echo>
      <mkdir dir="${build.dir}" />
      <mkdir dir="${dist.dir}" />
   </target>
   <!-- war the project -->
   <target name="war">
      <property name="war.dir" value="${dist.dir}/${ant.project.name}" />
      <property name="war.file" value="${war.dir}/${ant.project.name}.war" />
      <echo>Create war file ${war.file} from ${build.dir}</echo>
      <mkdir dir="${war.dir}" />
      <jar destfile="${war.file}" basedir="${build.dir}">
         <manifest />
      </jar>
   </target>
   <!-- Build the war file -->
   <target name="build" depends="init">
      <echo>Copy ${src.dir} to  ${build.dir}, expanding properties</echo>
      <copy todir="${build.dir}">
         <fileset dir="${src.dir}" />
         <filterchain>
            <expandproperties />
         </filterchain>
      </copy>
      <ant target="war" />
   </target>
</project>

Run this with ANT and it will create a build and a dist folder with the war file.
The war can be deployed to Weblogic, which results in a context root as configured in the build.properties. Everything placed in the folder configured as samlMetaData.home can then be fetched through Weblogic.
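Deploying the resulting war can be scripted too; here is a minimal WLST sketch, where the admin URL, credentials, deployment name, path and target are assumptions to adapt to your environment:
# Minimal WLST deployment sketch; URL, credentials, name, path and target are assumptions
connect('weblogic', 'welcome1', 't3://localhost:7001')
deploy('SamlMetaData', 'dist/SamlMetaData/SamlMetaData.war', targets='AdminServer')
disconnect()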

So just publish your metadata to that folder and the Identity Provider can get it auto-magically.
And of course you can use this for any other static file provisioning through Weblogic.

How to install the Notepad++ 64-bit plugin manager

I've been a Notepad++ fan for years. And as soon as a 64-bit version arose, I adopted it.
But a few months ago I got a new laptop, and apparently I didn't get the plugin manager with the fresh install. Only now did I take the opportunity to sort it out and write about it.

I found that the plugin manager has been available on GitHub since April 2017, version 1.49. I downloaded the zip from the mentioned location, choosing the _x64 version:
Then unzipped it into my Notepad++ folder:

Then started Notepad++ and the plugin manager appears:

So now I can format my XML files again...

Tuesday, 23 January 2018

SoapUI: validate a date field in response with current date

Once in a while you need to validate a service that has dates in the response. Although SoapUI has XPath and XQuery match assertions, validating dates against strings is quite difficult. How do you do a date comparison against, for instance, the current date?

You can do it with a script assertion:
And the content of this can be:
def groovyUtils = new com.eviware.soapui.support.GroovyUtils(context)
// Set Namespaces
def holder = groovyUtils.getXmlHolder(messageExchange.responseContent)
//holder.namespaces["soapenv"] = "http://schemas.xmlsoap.org/soap/envelope/"
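// Fetch the date string from the first row of the (JDBC) ResultSet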
def dateFoundStr = holder.getNodeValue("/Results/ResultSet/Row[1]/DATE_FOUND")
def dateFound = new Date().parse("yyyy-MM-dd hh:mm:ss", dateFoundStr)
dateFoundStr = dateFound.format("yyyy-MM-dd")
//Current Date
def date = new Date()
def currentDate=date.format("yyyy-MM-dd")
//
assert dateFoundStr == currentDate

First, we fetch and parse the response content into the holder variable, by passing messageExchange.responseContent to groovyUtils.getXmlHolder.

Second, the particular date is fetched, here dateFound, as a field from a JDBC response. A JDBC response does not have namespaces, but for a SOAP response it helps to declare namespaces. For an example, see the commented line for holder.namespaces["soapenv"].

Third, I parse the found date, which is a string as fetched from the XML, to a date-time, then format it back to a string to get only the date part. This could be done simply using substring methods, but I wanted to try this. And I get and format the current date as a string as well.

In the end, just do an assert with a comparison of both values.

There you go.

Monday, 22 January 2018

Modify your nodemanager.properties in WLST

In 2016 I did several posts on automatic installs of Fusion Middleware, including domain creation using WLST.

With Weblogic 12c you automatically get a pre-configured per-domain nodemanager. But you might find the configuration not completely suiting your wishes.

It would be nice to update the nodemanager.properties file with your properties in the same script.

Today I started upgrading our Weblogic Tuning and Troubleshooting training to 12c, and one of the steps is to adapt the domain creation script. In the old script, the AdminServer is started right away, to add the managed server to the domain. In my previously mentioned script, I do that offline. But since I also like to be able to update the nodemanager.properties file, I figured that out as well.

Earlier, I created a function to just write a new property file (it needs datetime and os imported at the top of the script):
import os
from datetime import datetime
#
# Create a NodeManager properties file.
def createNodeManagerPropertiesFile(javaHome, nodeMgrHome, nodeMgrType, nodeMgrListenAddress, nodeMgrListenPort):
  print ('Create Nodemanager Properties File for home: '+nodeMgrHome)
  print (lineSeperator)
  nmProps=nodeMgrHome+'/nodemanager.properties'
  fileNew=open(nmProps, 'w')
  fileNew.write('#Node manager properties\n')
  fileNew.write('#%s\n' % str(datetime.now()))
  fileNew.write('DomainsFile=%s/%s\n' % (nodeMgrHome,'nodemanager.domains'))
  fileNew.write('LogLimit=0\n')
  fileNew.write('PropertiesVersion=12.2.1\n')
  fileNew.write('AuthenticationEnabled=true\n')
  fileNew.write('NodeManagerHome=%s\n' % nodeMgrHome)
  fileNew.write('JavaHome=%s\n' % javaHome)
  fileNew.write('LogLevel=INFO\n')
  fileNew.write('DomainsFileEnabled=true\n')
  fileNew.write('ListenAddress=%s\n' % nodeMgrListenAddress)
  fileNew.write('NativeVersionEnabled=true\n')
  fileNew.write('ListenPort=%s\n' % nodeMgrListenPort)
  fileNew.write('LogToStderr=true\n')
  fileNew.write('weblogic.StartScriptName=startWebLogic.sh\n')
  if nodeMgrType == 'ssl':
    fileNew.write('SecureListener=true\n')
  else:
    fileNew.write('SecureListener=false\n')
  fileNew.write('LogCount=1\n')
  fileNew.write('QuitEnabled=true\n')
  fileNew.write('LogAppend=true\n')
  fileNew.write('weblogic.StopScriptEnabled=true\n')
  fileNew.write('StateCheckInterval=500\n')
  fileNew.write('CrashRecoveryEnabled=false\n')
  fileNew.write('weblogic.StartScriptEnabled=true\n')
  fileNew.write('LogFile=%s/%s\n' % (nodeMgrHome,'nodemanager.log'))
  fileNew.write('LogFormatter=weblogic.nodemanager.server.LogFormatter\n')
  fileNew.write('ListenBacklog=50\n')
  fileNew.flush()
  fileNew.close()

But this function just rewrites the file, so I need to determine the values for properties like DomainsFile, JavaHome, etc., which are already set correctly in the original file. I only want to update the ListenAddress and ListenPort, and possibly the SecureListener property, based on the nodemanager type. Besides that, I want to back up the original file as well.

So I adapted this function to:
#
# Update the Nodemanager Properties
def updateNMProps(nmPropertyFile, nodeMgrListenAddress, nodeMgrListenPort, nodeMgrType):
  nmProps = ''
  print ('Read Nodemanager properties file: %s' % nmPropertyFile)
  f = open(nmPropertyFile)
  for line in f.readlines():
    if line.strip().startswith('ListenPort'):
      line = 'ListenPort=%s\n' % nodeMgrListenPort
    elif line.strip().startswith('ListenAddress'):
      line = 'ListenAddress=%s\n' % nodeMgrListenAddress
    elif line.strip().startswith('SecureListener'):
       if nodeMgrType == 'ssl':
         line = 'SecureListener=true\n'
       else:
         line = 'SecureListener=false\n'
    # making sure these properties are set to true:
    elif line.strip().startswith('QuitEnabled'):
      line = 'QuitEnabled=%s\n' % 'true'
    elif line.strip().startswith('CrashRecoveryEnabled'):
      line = 'CrashRecoveryEnabled=%s\n' % 'true'
    elif line.strip().startswith('weblogic.StartScriptEnabled'):
      line = 'weblogic.StartScriptEnabled=%s\n' % 'true'
    elif line.strip().startswith('weblogic.StopScriptEnabled'):
      line = 'weblogic.StopScriptEnabled=%s\n' % 'true'         
    nmProps = nmProps + line
  # Backup file
  print nmProps
  nmPropertyFileOrg=nmPropertyFile+'.org'
  print ('Rename File %s to %s ' % (nmPropertyFile, nmPropertyFileOrg))
  os.rename(nmPropertyFile, nmPropertyFileOrg)  
  # Save New File
  print ('\nNow save the changed property file to %s' % nmPropertyFile)
  fileNew=open(nmPropertyFile, 'w')
  fileNew.write(nmProps)
  fileNew.flush()
  fileNew.close()
It first reads the property file, denoted by nmPropertyFile, line by line.
If a line starts with a particular property that I want to set specifically, that line is replaced. Each line is then added to the nmProps variable. For completeness and validation I print the resulting variable.
Then I rename the original file to nmPropertyFile+'.org' using os.rename(). And lastly, I write the contents of nmProps to the original file name in one go.
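In the domain-creation script this function can then be called, for instance right after writing the domain. A minimal usage sketch, where the per-domain NodeManager home (domainHome variable) and the listen address/port are assumptions:
# Usage sketch; domainHome and the listen address/port are assumptions
nodeMgrHome = domainHome + '/nodemanager'
updateNMProps(nodeMgrHome + '/nodemanager.properties', 'localhost', '5556', 'ssl')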


This brings me again one step closer to a completely scripted domain.

Thursday, 21 December 2017

Run SQLcl from ANT

I think it was last year that Oracle released SQLcl, which could be seen as the command-line variant of SQL Developer. But even better: a replacement for SQL*Plus.

A few years ago I created what I called an InfraPatch framework, to do preparations on an infrastructure as a prerequisite for the deployment of services and/or applications. It can run WLST scripts for creating datasources, jms-queues, etc. It also supported running database scripts, but that required a SQL*Plus installation, for instance using the instant client. Since it was part of a release/deploy toolset, where the created release is to be deployed by an IT admin on a test, acceptance or production environment, I had to rely on a correct Oracle/instant client installation in an agreed location.

I'm in the process of revamping that framework and renamed it to InfraPrep, since preparing an infrastructural environment makes it clearer what it does. (It does not patch a system with Oracle patches...)

Now I'm at the point that I have to implement support for running database scripts. The framework uses ANT, which in fact is Java. And SQLcl has two big advantages that make it ideal for use in my InfraPrep framework:
  • It is incredibly small: it's only 19MB! And that includes the ojdbc and xmlparser jars. Since I used ANT from a Fusion Middleware home, I could make it even smaller!
  • It is Java, so I can leverage ANT's java task.
 So, how to call SQLcl from ANT? I need a few ingredients:
  • Download and unzip SQLcl into my Ant project and add a sqlcl.home property:
    sqlcl.home=${basedir}/sqlcl
  • The actual sqlcl jar file and add the sqlcl.jar property for that:
    sqlcl.jar=oracle.sqldeveloper.sqlcl.jar
  • The main class file = oracle.dbtools.raptor.scriptrunner.cmdline.SqlCli
These ingredients can be found in the sql.bat in the bin folder of the SQLcl download.

Then of course in my environment property file I need the user name, password and database url.
Something like:
DWN.dbUrl=(description=(address=(host=darlin-vce-db.org.darwinit.local)(protocol=tcp)(port=1521))(connect_data=(service_name=orcl)))
DWN.dbUserName=dwn_owner
DWN.dbPassword=dwn_owner

I used a TNS-style database URL, since it is the same as used in the creation of the corresponding DataSource. And it can be reused to connect with SQLcl.

Now, to make it easier to use and to abstract the plumbing into a sort of ANT task, I created a macrodef:


 <!-- Macro to run a database script using SQLcl -->
  <macrodef name="runDbScript">
    <attribute name="dbuser"/>
    <attribute name="dbpassword"/>
    <attribute name="dburl"/>
    <attribute name="dbscript"/>
    <sequential>
      <logMessage message="DatabaseUrl: @{dburl}" level="info"/>
      <logMessage message="DatabaseUser: @{dbuser}" level="info"/>
      <logMessage message="DatabasePassword: ****" level="info"/>
      <property name="dbConnectStr" value='@{dbuser}/@{dbpassword}@"@{dburl}"'/>
      <property name="dbScript.absPath" location="@{dbscript}"/>
      <property name="dbScriptArg" value="@${dbScript.absPath}"/>
      <logMessage message="Run Database script: ${dbScriptArg}" level="info"/>
      <record name="${log.file}" action="start" append="true"/>
      <java classname="oracle.dbtools.raptor.scriptrunner.cmdline.SqlCli" failonerror="true" fork="true">
        <arg value="${dbConnectStr}"/>
        <arg value="${dbScriptArg}"/>
        <classpath>
          <pathelement location="${sqlcl.home}/lib/${sqlcl.jar}"/>
        </classpath>
      </java>
      <record name="${log.file}" action="stop"/>
    </sequential>
  </macrodef>
</project>

In this macrodefinition, I first build up a database connect string using the username, password and database url:
      <property name="dbConnectStr" value='@{dbuser}/@{dbpassword}@"@{dburl}"'/>
Then I use a little trick to create an absolute path of the dbscript path:
      <property name="dbScript.absPath" location="@{dbscript}"/>
The trick is in the location attribute of the property.
And since that now is a property instead of an attribute, I circumvented the need for escaping the @ character:
      <property name="dbScriptArg" value="@${dbScript.absPath}"/>
The logmessage task you see is another macrodef I use:
      <macrodef name="logMessage">
            <attribute name="message" default=""/>
            <attribute name="level" default="debug"/>
            <sequential>
                  <echo message="@{message}" level="@{level}"></echo>
                  <echo file="${log.file}" append="true"
                        message="@{message}${line.separator}" level="@{level}"></echo>
            </sequential>
      </macrodef>

It echoes the output both to the console and to a log file.
Since I want the output of the java task into the same log file, I enclosed the java task with record tasks to start and stop the appending of the output-stream to the log file.

The java task is pretty simple, referencing the jar file in the classpath and providing the connect string and the script run argument as two separate arguments.
There are however two important attributes:
  • failonerror="true": I want to quit my ANT scripting when the database script fails.
  • fork="true": when providing the exit statement in the sql script, SQLcl tries to quit the JVM. This is not allowed, because by default it runs in the same JVM as ANT. Not providing the exit statement in the script will keep the thread in SQLcl, which is not acceptable either. So, forking the JVM will allow SQLcl to quit properly.
Now, the macro can be called as follows:
    <propertycopy name="dbUser" from="${database}.dbUserName"/>
    <propertycopy name="dbUrl" from="${database}.dbUrl"/>
    <propertycopy name="dbPassword" from="${database}.dbPassword"/>
    <runDbScript dbuser="${dbUser}" dbpassword="${dbPassword}" dburl="${dbUrl}" dbscript="${prep.folder}/${dbScript}"/>

Where these properties are used:
database=DWN
dbScript=sample.sql

And the sample.sql file:
select * from global_name;
exit;

And this works like a charm:
runPrep:
     [echo] Script voor uitvoeren van database script.
     [echo] Environment:
     [echo] Prep folder: ../../infraPreps/BpmDbS0004
     [echo] Load prep property file ../../infraPreps/BpmDbS0004/BpmDbS0004.properties
     [echo] Run Script
     [echo] DatabaseUrl: (description=(address=(host=darlin-vce-db.org.darwinit.local)(protocol=tcp)(port=1521))(connect_data=(service_name=orcl)))
     [echo] DatabaseUser: dwn_owner
     [echo] DatabasePassword: ****
     [echo] Run Database script: @c:\temp\FMWReleaseAll\DWN\1.0.0\infraPreps\BpmDbS0004\sample.sql
     [java]
     [java] SQLcl: Release 17.3.0 Production on do dec 21 11:18:50 2017
     [java]
     [java] Copyright (c) 1982, 2017, Oracle.  All rights reserved.
     [java]
     [java] Connected to:
     [java] Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production
     [java] With the Partitioning, Real Application Clusters, Automatic Storage Management, OLAP,
     [java] Data Mining and Real Application Testing options
     [java]
     [java]
     [java] GLOBAL_NAME
     [java] --------------------------------------------------------------------------------
     [java] ORCL
     [java]
     [java]
     [java] Disconnected from Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production
     [java] With the Partitioning, Real Application Clusters, Automatic Storage Management, OLAP,
     [java] Data Mining and Real Application Testing options
     [echo] Done running preperations.

BUILD SUCCESSFUL
Total time: 12 seconds

One thing still to be arranged, though, is fetching the username/password from the command line instead of from properties. This can be done as follows:
    <input message="Enter database user for environment ${database}: " addproperty="db.user"/>
    <input message="Enter password for user ${db.user}: " addproperty="db.password">
      <handler classname="org.apache.tools.ant.input.SecureInputHandler"/>
    </input>

Conclusion

SQLcl is great, since it is small and written in Java. So it turns out to be incredibly easy to distribute it within your own framework.

Wednesday, 20 December 2017

OSB 12c Customization in WLST, some new insights: use the right jar for the job!

Problem setting and investigation

Years ago I created a Release & Deploy framework for Fusion Middleware, also supporting Oracle Service Bus. Recently I revamped it to use 12c. It uses WLST to import the OSB service into the Service Bus, including the execution of the customization file.

There are lots of examples to do this, but I want to zoom in on the execution of the customization file.

The WLST function I use to do this is as follows:
#=======================================================================================
# Function to execute the customization file.
#=======================================================================================
def executeCustomization(ALSBConfigurationMBean, createdRefList, customizationFile):
    if customizationFile!=None:
      print 'Loading customization File', customizationFile
      inputStream = FileInputStream(customizationFile)
      if inputStream != None:
        customizationList = Customization.fromXML(inputStream)
        if customizationList != None:
          filteredCustomizationList = ArrayList()
          setRef = HashSet(createdRefList)
          print 'Filter to remove None customizations' 
          print "-----"
          # Apply a filter to all the customizations to narrow the target to the created resources
          print 'Number of customizations in list: ', customizationList.size()
          for customization in customizationList:
            print "Add customization to list: "
            if customization != None:
              print 'Customization: ', customization, " - ", customization.getDescription()
              newCustomization = customization.clone(setRef)
              filteredCustomizationList.add(newCustomization)
            else:
              print "Customization is None!"
            print "-----"
          print 'Number of resulting customizations in list: ', filteredCustomizationList.size()
          ALSBConfigurationMBean.customize(filteredCustomizationList)
        else:
          print 'CustomizationList is null!'
      else:
        print 'Input Stream for customization file is null!'
    else:
      print 'No customization File provided, skip customization.'

The parameter ALSBConfigurationMBean can be fetched with:
...
        sessionName = createSessionName()
        print 'Created session', sessionName
        SessionMBean = getSessionManagementMBean(sessionName)
        print 'SessionMBean started session'
        ALSBConfigurationMBean = findService(String("ALSBConfiguration.").concat(sessionName), "com.bea.wli.sb.management.configuration.ALSBConfigurationMBean")
...
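For completeness: the createSessionName() and getSessionManagementMBean() helpers are not shown in this post. A minimal sketch of what they typically look like (my assumption, based on the common Oracle Service Bus WLST samples):
from java.lang import System
from com.bea.wli.sb.management.configuration import SessionManagementMBean

# Sketch of the helper functions referenced above (assumption, not shown in this post)
def createSessionName():
  # Generate a unique session name
  return 'WLSTSession_%s' % str(System.currentTimeMillis())

def getSessionManagementMBean(sessionName):
  # Create a new Service Bus change session and return the session management MBean
  sessionMBean = findService(SessionManagementMBean.NAME, SessionManagementMBean.TYPE)
  sessionMBean.createSession(sessionName)
  return sessionMBean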

The other parameter is the createdRefList, which is built up from the default ImportPlan during the import of the config jar:
...
            print 'OSB project', project, 'will get updated'
            osbJarInfo = ALSBConfigurationMBean.getImportJarInfo()
            osbImportPlan = osbJarInfo.getDefaultImportPlan()
            osbImportPlan.setPassphrase(passphrase)
            operationMap=HashMap()
            operationMap = osbImportPlan.getOperations()
            print
            print 'Default importPlan'
            printOpMap(operationMap)
            set = operationMap.entrySet()

            osbImportPlan.setPreserveExistingEnvValues(true)

            #boolean
            abort = false
            #list of created artifact refences
            createdRefList = ArrayList()
            for entry in set:
                ref = entry.getKey()
                op = entry.getValue()
                #set different logic based on the resource type
                type = ref.getTypeId()
                if type == Refs.SERVICE_ACCOUNT_TYPE or type == Refs.SERVICE_PROVIDER_TYPE:
                    if op.getOperation() == ALSBImportOperation.Operation.Create:
                        print 'Unable to import a service account or a service provider on a target system', ref
                        abort = true
                else:
                    #keep the list of created resources
                    print 'ref: ',ref
                    createdRefList.add(ref)
            if abort == true :
                print 'This jar must be imported manually to resolve the service account and service provider dependencies'
                SessionMBean.discardSession(sessionName)
                raise
            print
            print 'Modified importPlan'
            printOpMap(operationMap)
            importResult = ALSBConfigurationMBean.importUploaded(osbImportPlan)
            printDiagMap(importResult.getImportDiagnostics())              
            if importResult.getFailed().isEmpty() == false:
                print 'One or more resources could not be imported properly'
                raise
...

The point is to build up a set of references to the created artefacts, to narrow down the customizations so that they are only executed on the artefacts that are actually imported.

Now, back to the executeCustomization function. It first creates an InputStream on the customization file:
inputStream = FileInputStream(customizationFile)

on which it builds a list of customizations using the .fromXML method of the Customization object:
        customizationList = Customization.fromXML(inputStream)

These customizations are interpreted from the Customization file. If you open that you can find several customization elements:
 <cus:customization xsi:type="cus:EnvValueActionsCustomizationType">
        <cus:description/>
...
    <cus:customization xsi:type="cus:FindAndReplaceCustomizationType">
        <cus:description/>
...
    <cus:customization xsi:type="cus:ReferenceCustomizationType">
        <cus:description/>


These are all mapped to subclasses of Customization. And now the reason that I write this blog post is that I ran into a problem with my import tooling. In the EnvValueActionsCustomizationType the endpoint replacements for the target environments are done. And those weren't executed. In fact these customizations were in the customizationList, but as None/Null objects. Thus, executing this complete list using ALSBConfigurationMBean.customize(filteredCustomizationList) would run into an exception, referring to a null object in the customization list. That's why they're filtered out. But why weren't these interpreted by the .fromXML() method?

Strangely enough, in the Java API docs of 12.2.1 the EnvValueActionsCustomization does not exist, but the EnvValueCustomization does. But searching My Oracle Support shows in Note 1679528.2: 'A new customization type EnvValueActionsCustomizationType is available in 12c which is used when creating a configuration plan file.' And here in the Java API doc (click on com.bea.wli.config.customization) it is stated that EnvValueCustomization is deprecated and EnvValueActionsCustomization should be used instead.
Apparently the docs are not updated completely...
And it seems that I used a wrong jar file: the customization file was created using the console, and executing the customization file using the console did execute the endpoint replacements. So I figured that I must be using a wrong version of the jar file.
So I searched on my BPM quickstart installation (12.2.1.2) for the class EnvValueCustomization:
Jar files containing EnvValueCustomization
  • C:\Oracle\JDeveloper\12210_BPMQS\osb\lib\modules\oracle.servicebus.configfwk.jar/com\bea\wli\config\customization\EnvValueCustomization.class
  • C:\Oracle\JDeveloper\12210_BPMQS\oep\spark\lib\spark-osa.jar/com\bea\wli\config\customization\EnvValueCustomization.class
  • C:\Oracle\JDeveloper\12210_BPMQS\oep\common\modules\com.bea.common.configfwk_1.3.0.0.jar/com\bea\wli\config\customization\EnvValueCustomization.class
And then I did a search with EnvValueActionsCustomization.
Jar files containing EnvValueActionsCustomization:
  • C:\Oracle\JDeveloper\12210_BPMQS\osb\lib\modules\oracle.servicebus.configfwk.jar/com\bea\wli\config\customization\EnvValueActionsCustomization.class

Solution

It turns out that in my ANT script I used:
<path id="library.osb">
  <fileset dir="${fmw.home}/oep/common/modules">
     <include name="com.bea.common.configfwk_1.3.0.0.jar"/>
  </fileset> 
  <fileset dir="${weblogic.home}/server/lib">
    <include name="weblogic.jar"/>
    <include name="wls-api.jar"/>
  </fileset>
  <fileset dir="${osb.home}/lib">
    <include name="alsb.jar"/>
  </fileset>
</path>

Where I should use:
<path id="library.osb">
  <fileset dir="${fmw.home}/osb/lib/modules">
    <include name="oracle.servicebus.configfwk.jar"/>
  </fileset>
  <fileset dir="${weblogic.home}/server/lib">
    <include name="weblogic.jar"/>
    <include name="wls-api.jar"/>
  </fileset>
  <fileset dir="${osb.home}/lib">
    <include name="alsb.jar"/>
  </fileset>
</path>

Conclusion

It took me quite some time to debug this. But I learned how the customization works. I found quite some examples that use com.bea.common.configfwk_1.X.0.0.jar. And apparently during my revamping, I updated this classpath (actually I had 1.7, and found only 1.3 in my environment). But somehow Oracle found it sensible to replace it with oracle.servicebus.configfwk.jar while keeping the old jar files around.
So use the right Jar for the job!

Monday, 18 December 2017

Create the SOA/BPM Demo User Community, with just WLST.

As said in my previous post (I've learned somewhere that you should not post twice on the same day, but spread it out over time), I'm delivering a BPM 12c training, which I based on the BPM Quickstart. Although nice for unit tests and development, the integrated Weblogic lacks a decent set of users to test your task definitions.

Oracle has a demo community and a set of ANT and Servlet based scripts to provision your SOA or BPM Suite environment with a set of American literature writers, to be used in demos and trainings. I somehow found this years ago and had it debugged to be used in 12.1.3 a few years ago. However, I did not know where I got it and whether it was free to be delivered.

Apparently it is, and you can find it on My Oracle Support. Our friends at Avio Consulting also did a good job in making it available and working with 12c. However, I could not make it work smoothly end-to-end. I got it seeded, but figured that I would not need ANT and a Servlet.

Last year, in 2016, I created a bit of WLST scripting to create users for OSB and have them assigned to OSB Application roles. You can read about that here for the user creation, and here for the app-role assignment.

One thing that's missing in those scripts is the setting of the user attributes. So I googled around and found a means to add those too.

First, I had to transform the demo community seeding XML file into a property file. Like this:

#
cdickens.password=welcome1
cdickens.description=Demo User
cdickens.email=cdickens@emailExample.com
cdickens.title=CEO
cdickens.firstName=Charles
cdickens.lastName=Dickens
cdickens.timeZone=America/Los_Angeles
cdickens.languagePreference=en-US
cdickens.workPhone=100000001
cdickens.homePhone=200000001
cdickens.mobile=300000001
cdickens.im=jabber|cdickens@exampleIM.com

The complete usersAndGroups.properties file is available here.
In an earlier blog I wrote about how to read a property file. But my preferred method does not allow me to determine the property to be fetched dynamically. That's why I split off a basic createDemoUsers.properties file, which refers to the usersAndGroups.properties file and contains the properties referring to the Oracle/JDeveloper home and the connection details for the AdminServer. This property file also contains comma-separated lists of users, groups and AppRoles to be created or granted.

The actual createDemoUsers.py file loops over the three lists and creates the particular users and groups, and grants the AppRoles.
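To give an impression of that loop, here is a minimal sketch; the hard-coded user list, file name and connect details are assumptions (the real script is in the GitHub repo mentioned below):
from java.io import FileInputStream
from java.util import Properties

# Minimal sketch of the user-creation loop; assumes you are connected to the AdminServer,
# e.g. connect('weblogic', 'welcome1', 't3://localhost:7101')
userProps = Properties()
userProps.load(FileInputStream('usersAndGroups.properties'))
# In the real script the list of users comes from createDemoUsers.properties; hard-coded here for brevity.
userList = ['cdickens', 'jcooper']
authenticator = cmo.getSecurityConfiguration().getDefaultRealm().lookupAuthenticationProvider('DefaultAuthenticator')
for userName in userList:
  if not authenticator.userExists(userName):
    authenticator.createUser(userName, userProps.getProperty(userName + '.password'), userProps.getProperty(userName + '.description'))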

To set the attributes, the setUserAttributeValue method of the authenticator MBean can be used as follows:
    #Set Properties
    firstName=userProps.getProperty(userName+".firstName")
    lastName=userProps.getProperty(userName+".lastName")
    displayName=nvl(firstName, " ")+" "+nvl(lastName, " ")
    authenticator.setUserAttributeValue(userName,"displayName",displayName.strip())
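The other attributes from the property file can be set the same way. A short sketch; the embedded-LDAP attribute names I use here (mail, title, homephone, mobile) are my assumption:
    # Sketch: set further attributes from the property file (attribute names are an assumption)
    authenticator.setUserAttributeValue(userName, "mail", userProps.getProperty(userName + ".email"))
    authenticator.setUserAttributeValue(userName, "title", userProps.getProperty(userName + ".title"))
    authenticator.setUserAttributeValue(userName, "homephone", userProps.getProperty(userName + ".homePhone"))
    authenticator.setUserAttributeValue(userName, "mobile", userProps.getProperty(userName + ".mobile"))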

I published the complete set of scripts on the GitHub repo I shared with my colleague.
You can download them all and adapt the createDemoUsers.sh to refer to the correct MW_HOME of your JDeveloper environment. For Windows you might translate it to a .bat/.cmd file.

And of course you can use it for your own set of users.

I think I covered nearly all of the demo user community. Except for management chains: I could not find how to register a manager for a user in Weblogic, neither in the console, nor in WLST. So, for now, I conclude it cannot be done. But if you have a tip, please be so kind as to leave a comment. I would highly appreciate it.