A first glance at Mule’s API capabilities

MuleSoft has been offering middleware and messaging since 2006. Back then, building integrations was still heavily XML-file based, supported by a set of Eclipse plugins. This has changed though! With Anypoint Studio, an Eclipse-based graphical development environment for designing, testing and running Mule flows, they have really upped their game. Together with their iPaaS platform CloudHub and their API capabilities, MuleSoft is now regarded as a big player in the iPaaS and API arena, as shown in a previous article here where both Gartner and Ovum rate them as leaders. To add to that, MuleSoft was placed at number 20 in the Forbes Cloud 100.

[Image: mulesoft]

As I was quite curious how the new studio compares to my daily integration work with JDeveloper, I downloaded it and gave it a spin with a simple integration. You can download Anypoint Studio right here.

I started off by creating a simple Mule project: File > New > Mule Project, then click Finish. As you can see, you can also use Maven, but I will leave this unchecked for now.

[Image: MuleProject]

After clicking Finish, you will see that Studio generates a project structure on the left side:

[Image: MuleProjectExplorer]

In the middle is the canvas onto which you can drag and drop items from the palette on the right. Straight away you can see there are quite a few connectors which come out of the box.

[Image: MuleProjectConnectors]

I will start off with an HTTP connector. You can search the palette by typing HTTP. Now just drag the component onto the canvas. When you select the component, you can see its properties at the bottom. I will keep the default HTTP display name, but I will set the Path to /simplemuleservice. Then I need to generate a listener configuration by clicking the + sign on the right. I will accept all the default values, which are host 0.0.0.0 and port 8081. Next I am going to add a simple response. Look for Set Payload in the palette and drag it into the box next to the HTTP connector. Now select it and enter a value. Mine is: Hello…this is a simple Mule service.

[Image: setpayload]

Next let's add some logging. Find the Logger component and drag it onto the canvas. In the Message property of the Logger we can state what we want to log; the settings for the logging itself are defined in the log4j2.xml under resources. You can also use the Mule Expression Language here to access all sorts of info. I will just log all of the parameters sent with the request by using the following message: My simple mule service called with parameter #[message.inboundProperties.'http.query.params']

[Image: logger]
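If you open the Configuration XML tab, you can see the flow we just clicked together is plain XML underneath. Roughly, it looks like this (a sketch; Studio generates additional namespace declarations and doc:name attributes):

<http:listener-config name="HTTP_Listener_Configuration" host="0.0.0.0" port="8081"/>

<flow name="simplemuleserviceFlow">
    <!-- listen for HTTP requests on /simplemuleservice -->
    <http:listener config-ref="HTTP_Listener_Configuration" path="/simplemuleservice"/>
    <!-- return a fixed response -->
    <set-payload value="Hello...this is a simple Mule service"/>
    <!-- log the query parameters using the Mule Expression Language -->
    <logger level="INFO" message="My simple mule service called with parameter #[message.inboundProperties.'http.query.params']"/>
</flow>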

Now save the whole project. Let's see if it also runs. Just right-click the application and choose Run As > Mule Application.

You can see the application is running.

[Image: running]

Now open your browser and go to the server and port you defined; in my case http://localhost:8081/simplemuleservice. As I also added the parameters to the Logger step, I will add some of those as well, like this: http://localhost:8081/simplemuleservice?caller=Hugo&message=Hi

As you will see, the browser responds with the output we defined.

[Image: browser]

And the logging shows:
[Image: loggerOutput]

As you can see, creating a flow is much easier than before. Underneath it is still all XML, but the visual wrapper around it makes it easier to work with, in my opinion. This was a very simple flow though. The next step is to do something which is a bit more work and more like an actual use case.

Let's say we have a SOAP service running on-premise. We want to expose that service as an API to the outside world. So basically three things:

  • Design the API first
  • Convert REST to SOAP and call the webservice
  • Convert the SOAP response back to a JSON result and return it

I want to make an API for a web service which I have used before…the ConversionRateService. It has two input values, FromCurrency and ToCurrency. The result is a ConversionRateResult. So let's start with creating a simple RAML file. Mine looks like this:

#%RAML 1.0
title: Conversionrate API
version: v1.0
baseUri: http://conversionrate/api
mediaType: application/json
documentation:
  - title: Introduction
    content: |
      API to lookup currency conversion rates
/convert:
  get:
    responses:
      200:
        body:
          example: |
            {
                "conversionrate" : "1.2345",
                
            }

Now that we have our RAML file, start a new project; mine is called simplemule2. Don't forget to tick the Add APIkit components box below and select the RAML file we just created.

[Image: newProject]

When you click Finish, Anypoint Studio will generate skeleton backend flows based on the RAML file; the generated get:/convert flow looks roughly as sketched below. The next step is to delete the Set Payload in the get:/convert:conversionrate-config flow, as this is the flow in which we will call our web service. Drag the Web Service Consumer component onto the canvas. Then down below we can create a Connector Configuration.
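A sketch of that generated skeleton flow (details depend on your RAML; APIkit fills the Set Payload with the example from the RAML):

<flow name="get:/convert:conversionrate-config">
    <set-payload value='{ "conversionrate" : "1.2345" }'/>
</flow>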

[Image: webserviceconsumer]

I used the WSDL from an old project here.
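Underneath, the Connector Configuration is again just XML. A sketch of what it ends up as (the wsdlLocation, service and port values here are assumptions; yours come from the WSDL you select):

<ws:consumer-config name="ConversionRateService_Config"
    wsdlLocation="ConversionRateService.wsdl"
    service="ConversionRateService" port="ConversionRateServicePort"
    serviceAddress="http://localhost:8005/mockConversionRateService"/>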

Now that I have the call to the web service, I have to map the REST call to SOAP. Drag a Transform Message component before the Web Service Consumer. If you look at the properties below, you can see the input on the left side and the output on the right. As the input parameters come in as HTTP query parameters, we can simply type into the mapping on the right how we want to map them.

[Image: mapRequest]
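The DataWeave behind this mapping looks roughly as follows. The ns0 namespace and the element names are assumptions based on a typical ConversionRate WSDL; yours follow from the WSDL you imported:

%dw 1.0
%output application/xml
%namespace ns0 http://www.webserviceX.NET/
---
{
    ns0#ConversionRate: {
        ns0#FromCurrency: inboundProperties.'http.query.params'.FromCurrency,
        ns0#ToCurrency: inboundProperties.'http.query.params'.ToCurrency
    }
}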

Next we want to transform the result from SOAP to a JSON response, so drag in another Transform Message component, but this time place it after the Web Service Consumer. Again look at the properties below. On the left side you can see the data model of the web service response; the right side, Output, shows unknown. As I just want to map the response, you can edit the result on the right side as shown below:

[Image: mapResponse]
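Again a DataWeave sketch; the response element names are assumptions based on the WSDL:

%dw 1.0
%output application/json
%namespace ns0 http://www.webserviceX.NET/
---
{
    conversionrate: payload.ns0#ConversionRateResponse.ns0#ConversionRateResult
}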

As you can see, we will return a simple JSON response in which we map the conversionrate field to the web service response field ConversionRateResult. Save it and we are ready to test the application. To make it easy for myself, I have created the conversionrate web service in SoapUI by importing the WSDL and creating a mock service. It is running on http://localhost:8005/mockConversionRateService, as configured in the Web Service Consumer component.

Now right-click the application and choose Run As > Mule Application. As you can see, it will also start the APIkit Console automatically. It shows the description of the API and an interface to make calls. I will just use a good old browser call like this: http://localhost:8081/api/convert?FromCurrency=EUR&ToCurrency=USD which results in:

[Image: mockResponse]

As you can see, the call hits the SoapUI mock, which returns a SOAP response, which Mule then processes into a JSON response.

[Image: restResponse]

I was surprised how easy it was to create an API, design first. The UI works nicely and everything deploys quickly on the integrated Mule server. I have only scratched the surface of Mule's capabilities here, but I like what I am seeing.

SOA Suite 12C : Patch to fix the JDev business rule component performance

A while back I did a post about the business rule engine; see here. At the end of that post I added some considerations. One was the very annoying performance of the business rule component in JDev when using decision tables. If you were editing them, the memory usage started growing and after 5 minutes JDev would crash. Editing large decision tables was also very slow. The worst part back then was that the export and import to Excel was broken as well.

[Image: performance]

A while back there was a fix for that, but recently we found out that another JDev ADF patch greatly improves the usability of decision tables.

We applied the following three patches, but the last one really did the job.

  • Performance Patch (23754944)
  • Bundle Patch 161018 (20163149)
  • JDev ADF Security Patch (23754311)

Don’t forget to start JDev with the -clean parameter once to get a fresh start.

So next time you start pulling out your hair when editing a large decision table…think OPatch 23754311!

Integration Platforms as a Service in 2016

Working in the integration business, you have probably noticed an increasing number of data sources which are no longer on-premise. With more and more PaaS and SaaS solutions, the need to integrate them becomes obvious: Salesforce should talk to your on-premise CRM system, and your custom applications should update your Twitter feed, for example. Back in the day, you would get a piece of software usually called something with Gateway in the name. It would connect your internal enterprise software to the evil and scary outside world, and usually do something extra like authentication, authorisation and monitoring. But integrating with Salesforce is different from integrating with SAP, so you had to build quite a lot of custom software, which is expensive. You also had to configure every new connection to the outside world, as you don't want unwanted people fiddling around with your internal services.

How handy would it be if there was a platform located in the cloud which could connect to SaaS vendors out of the box? Then you would just have to make one connection from your company network to the platform, and every data stream could be configured through there.
Well ladies and gentlemen, meet iPaaS, aka Integration Platform as a Service.

[Image: iPaaS]

At this moment there are quite a few vendors offering an iPaaS solution. You have the obvious big boys such as Oracle, Microsoft and IBM, but also some lesser software behemoths which, according to Forrester, Gartner and Ovum, are taking the cake. Let's look at the reports for 2016.

[Image: reports]

In the left graph you can see the Forrester Wave, in the middle the Gartner Magic Quadrant, and in the right one the Ovum Decision Matrix.

As an integration consultant working with Oracle software, I was surprised Oracle wasn't present in two of the three reports. Oracle's Integration Cloud Service was identified by Gartner as a visionary, but they aren't in the Magic Quadrant as they seem to lack the ability to execute at the moment.

Dell Boomi
Looking at all three reports, Dell Boomi has the best credentials.

Dell Boomi serves SMBs and large enterprises with a unified, multipurpose integration platform to address multiple use cases, master data management (MDM), electronic data interchange (EDI), and API management. Dell Boomi’s templates and crowdsourced data mapping suggestion capability enable it to support nonintegration specialists in a limited manner. Dell Boomi continues to innovate, introducing, for example, features like a new online community site and crowdsourced capabilities for support, suggestion, and error resolution. The vendor fits particularly well with SMBs looking for an all-purpose integration product to avoid investing in too many integration skills. Large enterprises can also adopt Dell Boomi as a tactical choice to complete their SOA strategic investments to cover cloud integration needs. Some large companies that are unhappy with their heavy SOA investments would benefit from turning to Dell Boomi as a strategic choice. Dell Boomi should guide integrators with governance and canonical formats to reduce the complexity of maintaining point-to-point interfaces.

  • A lot of application connectors right out of the box
  • Web-based IDE
  • Architected specifically for cloud-based delivery
  • Pricing starts at $550 per month

Mule CloudHub
MuleSoft also scores well, appearing in the leaders quadrant in two reports.

MuleSoft CloudHub has matured significantly since its introduction in February 2011, and is widely used by midsize-to-large enterprises for achieving cloud service integration. MuleSoft has executed an aggressive product roadmap and strategy to achieve impressive subscription growth over the last two-to-three-year period. MuleSoft CloudHub offers easy federation with Mule ESB, a lightweight and scalable ESB, to effectively support hybrid integration needs. In addition, MuleSoft Anypoint Platform for mobile enables API-led connectivity with backend applications and data sources, such as Salesforce.com, ServiceNow, SAP, and Siebel applications/platforms.

  • Also a lot of pre-built connectors
  • Platform lift-and-shift from on-premise to the Amazon cloud
  • Anypoint Studio, an Eclipse-based on-premise IDE
  • Pricing unknown

IBM WebSphere Cast Iron Cloud Integration
Also scoring strongly in all three reports.

IBM WebSphere Cast Iron Cloud Integration is a relatively mature solution capable of supporting a range of integration needs, including cloud-to-cloud, on-premise-to-cloud, and mobile application integration. It offers easy connectivity to several other WebSphere middleware platforms to cater for key integration requirements, including B2B integration (via IBM Sterling Commerce suite) and API management. It can be used with IBM Mobile Foundation bundle to achieve connectivity between mobile applications developed on IBM Worklight and other on-premise and SaaS applications.

  • Very complete solution with an ESB, BPM, BAM and MFT
  • WebSphere Cast Iron Studio, an on-premise IDE
  • Pricing unknown

Jitterbit
One of the runners-up…

The Jitterbit Harmony platform addresses multiple integration requirements, including data, process, hybrid, B2B, real-time API management, and IoT integrations via a single, comprehensive platform. Jitterbit is available through direct sales teams in North America, Europe, and Asia and includes a free 30-day trial available from the website. It has a differentiated partner program with more than 200 resellers, technology companies, and independent software vendors, including Autodesk, Microsoft, NetSuite, Salesforce, and SAP, to deliver prebuilt integration solutions and templates for specific business and industry processes that enable business users and technologists to quickly connect applications, data, and business processes across on-premises and cloud environments. Jitterbit’s single, multitenant cloud platform fits particularly well with companies that are strategically moving to the cloud but need to connect with on-premises systems and databases. Jitterbit has 12 years of integration expertise, has 40,000 freemium and paid customers, and is based in the San Francisco Bay Area.

  • Limited market share but ambitious
  • Web-based IDE
  • Pricing starts at $2,000 per month

Using the Coherence Adapter in SOA Suite 12C

Retrieving data from a back-end system or executing something in a business rule engine can cost quite some time. When we build services we strive for performance, but sometimes you have services which you just can't make any faster. One thing to keep in mind is that it is possible to cache in SOA Suite using the Coherence adapter. Say retrieving data from a back-end system in a composite and enriching it takes 1400 ms on average, but over 24 hours 15% of the calls carry the same request, which should give exactly the same answer! Doing the retrieval and enrichment every time seems like a shame. This is where the Coherence adapter comes in.

The Coherence adapter can store certain pieces of data for a certain amount of time. This overview consists of two parts: the first is the configuration of the Coherence adapter in WebLogic, the second is the usage of the adapter in a composite. Now let's get started.

Configuring the Coherence adapter

  1. The Coherence adapter isn't active out of the box. It is not targeted to any managed server, so the first thing you have to do is target it.

    Go to your SOA server's console, go to Deployments and click the CoherenceAdapter. Go to the Targets tab and check whether it is targeted. If it is not, target it: click Lock & Edit, check the soa_cluster checkbox, click Save and then Activate Changes.
    [Image: Target]

  2. The next step is to add a cache configuration file. This step requires you to physically put an XML file on the file system of the server. In our example it is a file called ProductServiceCache-Configuration.xml located in /u01/domains/dev_soa_domain. On a clustered environment the file has to be on both servers! My configuration file looks like this:
    <?xml version="1.0"?>
    <!DOCTYPE cache-config SYSTEM "cache-config.dtd">
    <cache-config>
      <caching-scheme-mapping>
        <cache-mapping>
          <cache-name>ProductServiceCache</cache-name>
          <scheme-name>transactional</scheme-name>
        </cache-mapping>
      </caching-scheme-mapping>
      <caching-schemes>
        <transactional-scheme>
          <scheme-name>transactional</scheme-name>
          <service-name>DistributedCache</service-name>
          <autostart>true</autostart>
        </transactional-scheme>
      </caching-schemes>
    </cache-config>
    

    It is wise to create separate caches for separate services in case you want to set different cache timers or in case of the occasional flush.
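    Adding a cache for another service, say a hypothetical CustomerService, is just an extra cache-mapping entry pointing at the same scheme:

    <cache-mapping>
      <cache-name>CustomerServiceCache</cache-name>
      <scheme-name>transactional</scheme-name>
    </cache-mapping>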

  3. The next step is creating a Coherence Adapter connection factory. Go to your SOA environment console, go to Deployments and click the CoherenceAdapter. Go to the Outbound Connection Pools tab and click New. Choose javax.resource.cci.ConnectionFactory and input your JNDI name; mine is eis/Coherence/ProductService. Click Finish. Now select the connection factory you just created, choose the Properties tab and set the following values:
    [Image: ConnectionFactorySettings]
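    The essential property here is the one pointing the adapter at the cache configuration file from step 2. As a sketch (verify the exact property names in your console):

    CacheConfigLocation = /u01/domains/dev_soa_domain/ProductServiceCache-Configuration.xml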
  4. As a last step, you have to make the CoherenceAdapter pick up the new settings. You can do this by updating the deployment: go to Deployments, click Lock & Edit, select the checkbox of the CoherenceAdapter, click Update and then Finish.
    [Image: CoherenceAdapter]

Okay, now we have configured Coherence, including a cache for the ProductService.

Making use of the Coherence adapter in a composite

To make use of the adapter in your BPEL process, you will have to add two JCA connections.
[Image: Coherence-JCA]

We are going to call them retrieveResult and writeResult. Let's start off with retrieveResult. Drag the Coherence Adapter onto the right-hand part of the canvas and follow the wizard:

[Image: RetrieveResult]

At step 4, when pressing Finish, it will warn you: No cache key specified. No worries, as we will take care of this later on.

Now we will create the writeResult JCA adapter:
[Image: WriteResult]

I set the Time To Live of the cache to 10,000 milliseconds for the test.
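That Time To Live ends up as a property in the generated writeResult .jca file. Roughly, the file contains something like this (a sketch from memory of a generated file, so verify the class and property names against your own):

<adapter-config name="writeResult" adapter="Coherence Adapter"
    xmlns="http://platform.integration.oracle/blocks/adapter/fw/metadata">
  <connection-factory location="eis/Coherence/ProductService"/>
  <endpoint-interaction portType="Put_ptt" operation="Put">
    <interaction-spec className="oracle.tip.adapter.coherence.jca.CoherenceInteractionSpec">
      <property name="CacheName" value="ProductServiceCache"/>
      <property name="TimeToLive" value="10000"/>
    </interaction-spec>
  </endpoint-interaction>
</adapter-config>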

Now wire your BPEL process to your two adapters like you normally would. It will then look like this:

[Image: CompleteComposite]

Now in your BPEL you want to do two things:

  • Check if something is in the cache based on your key, and if so, return the result.
  • If it is not in there, do your normal processing and write the result into the cache.

The reading and writing take place like normal call-outs, as you can see. Just drag an Invoke activity onto your BPEL and wire it to the correct partner link. Let's first do the retrieve. Invoke the retrieveResult and create input and output variables like you normally would. For the unique key, go to the Properties tab and add the property jca.coherence.Key. Here you define the unique key for your result; I just concatenated some strings from the request.

[Image: retrieveCall]
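In the BPEL source this boils down to a property on the invoke. A minimal sketch (the port type name and the $cacheKey variable are made up for illustration):

<invoke name="InvokeRetrieveResult" partnerLink="retrieveResult"
        portType="ns2:Get_ptt" operation="Get"
        inputVariable="retrieveResult_InputVariable"
        outputVariable="retrieveResult_OutputVariable">
  <bpelx:toProperties>
    <bpelx:toProperty name="jca.coherence.Key">$cacheKey</bpelx:toProperty>
  </bpelx:toProperties>
</invoke>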

In your BPEL you can then check if the retrieve call gave you any data back. If it did, you can skip all the normal processing and just return the result. If you ended up with nothing, do your normal processing and, at the end, write the result to the cache. Use the same property again and make sure you assign the same expression or variable which you used for the retrieve.

[Image: retrieveCall]
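The surrounding logic is a plain if/else in BPEL. Schematically (condition and variable names are illustrative):

<if name="CheckCacheHit">
  <condition>not(empty($retrieveResult_OutputVariable.body))</condition>
  <sequence>
    <!-- cache hit: map the cached result straight to the response -->
  </sequence>
  <else>
    <sequence>
      <!-- cache miss: call the back-end, enrich the result -->
      <!-- then invoke writeResult with the same jca.coherence.Key property -->
    </sequence>
  </else>
</if>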

It is that easy. My composite now looks something like this:
[Image: final]

Now just build and deploy and see if it works!

My first SOAP call takes about 1777 ms, and that is without a cache hit. Now let's call it again within 10 seconds. There we go: 800 ms. Now let's check the EM to see if we hit the actual cache:

[Image: EM]

There we go!

Some considerations:

  • This is an easy way to make things faster, but do keep checking with performance, load and stress tests what happens to your CPU and memory usage under big loads. The A-Team has some good articles about this.
  • You might want a different Time-To-Live per environment, for example 10 minutes on Dev but 1 hour on Test and 24 hours on Prod. You can easily do this using server tokens: add them in the EM under Tokens and use them in the JCA files, like ${ProductServiceCacheTime} for example, as sketched below.
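The property in the .jca file would then look something like this (assuming the TimeToLive property name from the sketch earlier):

<property name="TimeToLive" value="${ProductServiceCacheTime}"/>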

References:

  • http://www.ateam-oracle.com/using-the-12c-coherence-adapter-to-access-a-non-transactional-local-cache/
  • https://docs.oracle.com/middleware/1221/adapters/develop-soa-adapters/GUID-EA82A9A3-656E-464A-B764-E076C172BAEF.htm#TKADP2495

SOA Suite 12C: Add version information to your ServiceBus projects using a custom Maven plugin

As you have seen in my previous posts, it is possible to build your Service Bus and SOA components using Maven; see here. One of the issues we encountered is that building Service Bus projects only supports some very basic Maven functionality. For example, the description property in the Maven POM file is not mapped to the description field of a Service Bus project, which would have been nice, as this is the only extra field we can use to hold some extra information…for example version info!

For SOA composites you are able to add versioning info by updating the composite.xml using a Google Maven plugin. Just add this plugin to your build:

<!--Needed to replace the revision in the composite.xml due to bug (20553998) which causes not to update the revision correctly -->
<plugin>
	<groupId>com.google.code.maven-replacer-plugin</groupId>
	<artifactId>replacer</artifactId>
	<version>1.5.3</version>
	<executions>
		<execution>
			<phase>initialize</phase>
			<goals>
				<goal>replace</goal>
			</goals>                   
		</execution>
	</executions>
	<configuration>
		<ignoreMissingFile>true</ignoreMissingFile>
		<file>${scac.input}</file>
		<xpath>//composite/@revision</xpath>
		<token>^.*$</token>
		<value>${composite.revision}</value>
	</configuration>
</plugin>

which results in:
[Image: versioningSoa]

For Service Bus projects there is no such thing. How can we then see which version we have?! Well, the only field we can use is the description field. The only problem is that you cannot update it using Maven. The things you have to do are:

  • Unzip the sbconfig.sbar
  • Update the _projectdata.LocationData file which holds a proj:description tag
  • Zip the sbconfig.sbar again

Not too difficult at all. You could probably do this with Ant, but that is so 2001! We have Maven now, so why not write a custom Maven plugin which does all of this for you?! I won't bother you with the Java details, but you can download the plugin jar here!

Just install it into your Maven repository by running:

    mvn install:install-file -Dfile=version-information-plugin-1.0.jar -DgroupId=nl.redrock.maven.plugins.servicebus -DartifactId=version-information-plugin -Dversion=1.0 -Dpackaging=jar

Now that you have the plugin installed, you can wire it into the package phase of your Service Bus project by adding the plugin to your Service Bus POM file. Mine looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<project xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd"
         xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <modelVersion>4.0.0</modelVersion>
  <parent>
    <groupId>com.oracle.servicebus</groupId>
    <artifactId>sbar-project-common</artifactId>
    <version>12.1.3-0-0</version>
  </parent>
  <groupId>nl.redrock</groupId>
  <artifactId>ConversionRateService</artifactId>
  <version>1.1.2.5</version>
  <packaging>sbar</packaging>
  <description></description>
  <build>
    <plugins>
     <plugin>
        <groupId>nl.redrock.maven.plugins.servicebus</groupId>
        <artifactId>version-information-plugin</artifactId>
        <version>1.0</version>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>version</goal>
            </goals>
          </execution>
        </executions>
        <configuration>
          <description>${project.version}</description>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>

The plugin will now be called after the normal packaging has completed. In the configuration you can set the description; in my case I fill it with the version info of the component itself. Now just build your Service Bus project using Maven:

[Image: mavenbuild]

and deploy, go to the sbconsole, and voilà:

[Image: sbconsole]

Installing SOA BP 12.1.3.0.5 makes the flow instance title disappear. Incoming workaround!

After installing bundle patch 12.1.3.0.5, we noticed that the flow instance titles of our instances in SOA were flaky: sometimes they appeared, sometimes they didn't. After some testing and going back and forth with Oracle Support, we were not able to reproduce the bug consistently. We did come up with a workaround though, and that is forcing a dehydrate. Not the best option in my opinion, but a possible workaround. So if you have this issue, just add a dehydrate activity to the BPEL and magically see your flow instance titles come back to life again.

[Image: dehydrate]

When this bug is going to be resolved is still unknown.

Continuous Discussions: Orchestrating Enterprise Software Development Testing

A while back I was asked by the people from Electric Cloud if I would be interested in participating in a panel to talk about orchestrating enterprise software development testing, as part of their Continuous Discussions series.

[Image: c9d9-logo3]
I thought it was a very cool idea and of course said yes. The questions we talked about were things like:

  • What does your test matrix look like?
  • How do you define the pathway through your test cycle?
  • How do you manage test data and environments?
  • How have testing needs changed over time?

We had quite a nice blend of people from the IT industry with different backgrounds, so it was very interesting to hear other people's stories, views and opinions! Have a look at the discussion right here:

Also see the blog post by Electric Cloud here and the whole series here.

Thanks Sam, Anders, Gilad and Avigail for having me!

For the people who can't or don't want to watch the video…here is a transcript of my contribution:

Continue reading

OFM 12C: Running WLST scripts in your build pipeline using the weblogic-maven-plugin

When you are building and deploying Service Bus or SOA composites to the server, you will have certain dependencies on the server, such as data sources or JMS resources. Those resources must be there if you want to deploy. A best practice is of course to script these: you can use WLST to run .py scripts which create your resources. The only problem is that you want to know for sure the resources are there. If you want to be consistent in your roll-outs on Dev, Tst, Acc and Prd, you want to do this in the same manner everywhere. Operations usually do the roll-out on Acc and Prd, but how do we make sure we do this in the exact same way on Dev and Tst?

If you are using a build pipeline in Jenkins (see here and here), you can easily add a step which creates the resources for you by running a script. How do we do this?

We are going to make use of the weblogic-maven-plugin; see here for the documentation. First make sure you have installed the plugin into your local repository. As the documentation says, do the following:

  • Change directory to ORACLE_HOME\oracle_common\plugins\maven\com\oracle\maven\oracle-maven-sync\12.1.3
  • mvn install:install-file -DpomFile=oracle-maven-sync-12.1.3.pom -Dfile=oracle-maven-sync-12.1.3.jar
  • mvn com.oracle.maven:oracle-maven-sync:push -Doracle-maven-sync.oracleHome=c:\oracle\middleware\oracle_home\

You can check if it was successful by running

mvn help:describe -DgroupId=com.oracle.weblogic -DartifactId=weblogic-maven-plugin -Dversion=12.1.3-0-0

This should list all 24 goals of the plugin.

Now for a simple test, I have created a script test.py which adds a queue to the SOAJMSModule:

try:
    print('--> about to connect to weblogic')
    connect('USERNAME', 'PASSWORD', 't3://localhost:7001')
    print('--> about to create a queue ' + "MyQueue")
    edit()
    startEdit()
    cd('/JMSSystemResources/SOAJMSModule/JMSResource/SOAJMSModule')
    cmo.createQueue("MyQueue")
    cd('/JMSSystemResources/SOAJMSModule/JMSResource/SOAJMSModule/Queues/' + "MyQueue")
    set('JNDIName', 'jms.myqueueu')
    set('SubDeploymentName', 'SOASubDeployment')
    save()
    print('--> activating changes')
    activate()
    print('--> done')
except:
    # any failure (connecting, creating or activating) ends up here
    print('--> failed to connect or to create the queue')
    dumpStack()

I created a simple .pom file which only defines the build plugin:

<?xml version="1.0" encoding="UTF-8"?>
<project xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd" xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">

	<modelVersion>4.0.0</modelVersion>
	<groupId>nl.redrock</groupId>
	<artifactId>ResourceService</artifactId>
	<version>1.0</version>
	<packaging>jar</packaging>
	<description/>

	<build>
		<plugins>
			<plugin>
				<!-- This is the configuration for the weblogic-maven-plugin -->
				<groupId>com.oracle.weblogic</groupId>
				<artifactId>weblogic-maven-plugin</artifactId>
				<version>12.1.3-0-0</version>
				<configuration>
					<middlewareHome>/fmwhome/wls12130</middlewareHome>
				</configuration>
				<executions>
					<!-- Execute the wlst-client goal during the post-integration-test phase -->
					<execution>
						<id>wls-wlst-server</id>
						<phase>post-integration-test</phase>
						<goals>
							<goal>wlst-client</goal>
						</goals>
						<configuration>
							<executeScriptBeforeFile>true</executeScriptBeforeFile>
						</configuration>
					</execution>
				</executions>
			</plugin>
		</plugins>
	</build> 
</project>

Now let's see if we can run the script by running the following:
mvn com.oracle.weblogic:weblogic-maven-plugin:wlst-client -DfileName=test.py

As you can see, the script ran successfully:

[Image: Run]

and to be sure, we can check in the console:
[Image: console]

And if you incorporate it into your Jenkins build pipeline, it could look something like this:

[Image: jenkins]

So this is an easy way to consistently run scripts throughout your DTAP environments.

Some considerations though:

  • As you run the script every time you build a component, make sure the script takes into account that the resources might already exist
  • The creation of some resources requires a server restart. You can also restart servers using the WebLogic plugin, but I'm not yet sure what a good way of working is here