Building a simple microservice using Spring Boot

In this short post, I will show how to build a simple JPA microservice using Spring Boot. Spring Boot makes it easy to create stand-alone applications that you can just run and that need very little Spring configuration, as we will see in this short tutorial.


For an explanation of microservices, read this article by Martin Fowler.

As I was saying, we are going to use Spring Boot. First, start off with a simple Maven-enabled Java project in the IDE of your choice. We will start with the pom file to get all the Spring dependencies right. I am going to build a simple ReferenceDataService microservice which can deliver some simple reference data, such as a list of countries. Let's take a look at the pom file:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <artifactId>ReferenceDataService</artifactId>
    <groupId>nl.redrock</groupId>
    <packaging>jar</packaging>
    <name>ReferenceData Microservice</name>
    <description>ReferenceData Service</description>
    <version>1.0</version>
    
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>1.5.2.RELEASE</version>
    </parent>
    
	
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-data-jpa</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
        <!-- Oracle JDBC driver -->
        <dependency>
            <groupId>com.oracle</groupId>
            <artifactId>ojdbc7</artifactId>
            <version>7.0</version>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>
</project>

As you can see, we use the 1.5.2.RELEASE parent as a basis. We then have 4 dependencies. We need spring-boot-starter-web for the libraries to create the REST service, spring-boot-starter-data-jpa for the JPA capability, and spring-boot-starter-test for testing capabilities; last is the ojdbc jar, because the data is located in an Oracle database. I am also adding the spring-boot-maven-plugin to be able to run the application from Maven on an embedded Tomcat.
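Note that the Oracle JDBC driver is not available in Maven Central. Assuming you have the ojdbc7 jar on disk, one way to make the dependency above resolvable is to install it into your local repository yourself:

mvn install:install-file -Dfile=ojdbc7.jar -DgroupId=com.oracle -DartifactId=ojdbc7 -Dversion=7.0 -Dpackaging=jar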

Let's start off with the entry point…the application. This is the starting point of our simple service. It will look like this:

package nl.redrock.referencedataservice;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class ReferenceDataApplication {

    public static void main(String[] args) {
        SpringApplication.run(ReferenceDataApplication.class, args);
    }
}

As you can see, it is very simple. Use the @SpringBootApplication annotation to tag it as a Spring Boot application and that is it. Next we want a class which can retrieve data from the database. Spring has a bunch of helpful classes and templates to help you with this. I already have a pre-filled Oracle database running with 1 table called COUNTRY. It has 3 columns: ID, CODE and NAME.

[Screenshot: the COUNTRY table in the database]
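For reference, a minimal sketch of the table definition (the column types are assumptions; only the column names are given above):

CREATE TABLE COUNTRY (
    ID   NUMBER PRIMARY KEY,
    CODE VARCHAR2(10),
    NAME VARCHAR2(100)
);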

So as a first step we will create a Country class which maps to the table.

package nl.redrock.referencedataservice.data;

import javax.persistence.Entity;
import javax.persistence.Id;

@Entity
public class Country {
    
    @Id
    private int id;
    private String code;
    private String name;

    /**
     * @return the id
     */
    public int getId() {
        return id;
    }

    /**
     * @param id the id to set
     */
    public void setId(int id) {
        this.id = id;
    }

    /**
     * @return the code
     */
    public String getCode() {
        return code;
    }

    /**
     * @param code the code to set
     */
    public void setCode(String code) {
        this.code = code;
    }

    /**
     * @return the name
     */
    public String getName() {
        return name;
    }

    /**
     * @param name the name to set
     */
    public void setName(String name) {
        this.name = name;
    }
    
    @Override
    public String toString() {
        return String.format("Country[id=%d, code='%s', name='%s']",
                id, code, name);
    }
}

As you can see, this is also a very simple POJO with 3 attributes mapping to the 3 columns, with getters and setters. The annotation which does all the magic here is @Entity. This tells Spring that the class can be used for object-relational mapping. Next we will create a class which will fetch the data.

package nl.redrock.referencedataservice.repository;

import nl.redrock.referencedataservice.data.Country;
import org.springframework.data.repository.CrudRepository;

public interface CountryRepository extends CrudRepository<Country, Integer> {

    Country findById(int id);
}

And again…not much coding here. Just a simple interface which extends Spring's CrudRepository. We define just one extra finder method for retrieving a country by its id; Spring derives the query from the method name. And basically that is it. We just have one thing left to do, and that is telling Spring which database to connect to. This is easily done by adding an application.properties to your classpath with all the settings in it, so that it is automatically picked up by Spring.

#port to run the embedded Tomcat on
server.port=8888

# Oracle settings
spring.datasource.url=jdbc:oracle:thin:@private.eu-west-1.compute.amazonaws.com:1521:xe
spring.datasource.username=SECRET
spring.datasource.password=SECRET
spring.datasource.driver-class-name=oracle.jdbc.driver.OracleDriver

#set sql level to debug to see all the sql statements
logging.level.org.hibernate.SQL=debug
logging.pattern.console=%d{yyyy-MM-dd HH:mm:ss} %-5level %logger{36} - %msg%n

The server.port setting lets you adjust the port the embedded Tomcat runs on. Next up are the Oracle database connection settings, and last of all some logging tweaking. The last thing to do is to write a unit test to see if it all works.

package nl.redrock.referencedataservice.repository;

import junit.framework.TestCase;
import nl.redrock.referencedataservice.data.Country;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.orm.jpa.AutoConfigureTestDatabase;
import org.springframework.boot.test.autoconfigure.orm.jpa.AutoConfigureTestDatabase.Replace;
import org.springframework.boot.test.autoconfigure.orm.jpa.DataJpaTest;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@DataJpaTest
@AutoConfigureTestDatabase(replace=Replace.NONE)
public class CountryRepositoryTest extends TestCase {
    
    @Autowired
    CountryRepository countryRepository;
    
    @Test
    public void testCountryRepository(){
        Country c = this.countryRepository.findById(1);
        assertTrue(c != null);
        assertTrue(c.getCode().equals("ac"));
        assertTrue(c.getName().equals("Ascension Island"));
    }
}

The things to look for here are the @DataJpaTest annotation, which tells Spring it is a JPA test, and the @AutoConfigureTestDatabase(replace=Replace.NONE) annotation, which tells Spring not to replace the application's default DataSource with an embedded test database.

Run the test:

[Screenshot: the unit test passing]

As you can see, we can fetch data from the database with minimal coding.

Now for the service part. Spring also has easy ways to accommodate this using the @RestController annotation.

package nl.redrock.referencedataservice.controller;

import java.util.ArrayList;
import java.util.List;
import java.util.logging.Level;
import java.util.logging.Logger;
import nl.redrock.referencedataservice.data.Country;
import nl.redrock.referencedataservice.repository.CountryRepository;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/referencedataservice/countries")
public class ReferenceDataController {
    
    private final static Logger LOGGER = Logger.getLogger(ReferenceDataController.class.getName());
    
    @Autowired
    CountryRepository countryRepository;


    @RequestMapping("/{id}")
    public Country getCountry(@PathVariable int id) {
        Country result;
        LOGGER.log(Level.INFO, "Getting country with id " + id);
        result = countryRepository.findById(id);
        return result;
    }

    @RequestMapping(method = RequestMethod.GET)
    List<Country> getCountries() {
        List<Country> result;
        LOGGER.log(Level.INFO, "Getting all countries");
        result = new ArrayList<>();
        Iterable<Country> countryList = countryRepository.findAll();
        for (Country country : countryList) {
            result.add(country);
        }
        return result;
    }
}

As you can see, we implement two operations: one to get all the countries and one to get a specific country by its id. We use @Autowired to inject the countryRepository. And now for the proof of the pudding: run mvn clean spring-boot:run and watch Maven spin up an embedded Tomcat instance with the referencedataservice application deployed on it. Open up the browser and call:

http://localhost:8888/referencedataservice/countries
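Based on the data asserted in the unit test above, the first entry of the returned JSON array should look something like this (a sketch of the response shape):

[ { "id" : 1, "code" : "ac", "name" : "Ascension Island" }, … ]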

[Screenshot: the JSON list of countries in the browser]

Now call http://localhost:8888/referencedataservice/countries/160 to get a specific country.

[Screenshot: the JSON result for a single country]

As you can see, Spring makes it very easy to create REST services with minimal coding. If you want to look into some more advanced microservice features Spring has to offer, have a look here and here to see how you can use microservices in conjunction with Netflix's Eureka server.

SOA Suite 12C: Generating a JSON Web Token (JWT) in OSB

“JSON Web Token (JWT) is a JSON-based open standard (RFC 7519) for creating access tokens that assert some number of claims. For example, a server could generate a token that has the claim “logged in as admin” and provide that to a client. The client could then use that token to prove that it is logged in as admin. The tokens are signed by the server’s key, so the client is able to verify that the token is legitimate. The tokens are designed to be compact, URL-safe and usable especially in web browser single sign-on (SSO) context. JWT claims can be typically used to pass identity of authenticated users between an identity provider and a service provider, or any other type of claims as required by business processes. The tokens can also be authenticated and encrypted.” Wikipedia


I guess the above text says it all. I won't be going into all the JWT details, as you can read most of it online on JWT.io. For my current customer I had to interface with a SaaS product which used JWT as an authentication measure. There are quite a few libraries for Java which you can use to generate tokens. To mention a few:

  • com.auth0:java-jwt
  • org.bitbucket.b_c:jose4j
  • com.nimbusds:nimbus-jose-jwt
  • io.jsonwebtoken:jjwt

Easy, I thought: I grabbed the most obvious one I could find and created a simple class which could create a token for me. It sounded fairly easy, but the jar I used had its own dependencies, which of course also had to be available in WebLogic for it to work. This is where the problems started, as the dependencies seemed to cause some class loading issues. I tried another JWT implementation, but this also seemed to cause issues. After some googling I found that Oracle has JWT support as well, looking at this page. The only thing I really couldn't find very quickly was which jars I needed and where they were…

After some poking around in my Oracle_Home I found the right jars though. The ones you need to create a token are:

  • osdt_cert.jar
  • osdt_core.jar
  • osdt_restsec.jar
  • jackson-core-asl-1.1.1.jar
  • jackson-mapper-asl-1.1.1.jar

The first 3 are located in your OracleHome\oracle_common\modules\oracle.osdt_12.1.3 folder. The 2 Jackson ones you can just grab off the internet.
These jars contain all the classes you need to create a token and verify it in your Java development environment. The first three are already on your WebLogic classpath. The two Jackson jars also seem to be somewhere on the server, although the documentation makes you think otherwise: I didn't need to put them on the server myself to make it work.

To create a token you can just use the following code for example:

package nl.redrock.jwt;

import oracle.security.restsec.jwt.*;

public class JWTGenerator {
    
    
    public static String generateJWT(String aCode, String aAmount, String aKey) throws Exception {
        
        String result = null;
        
        JwtToken jwtToken = new JwtToken();
        // Set the algorithm, the token type and the custom claim parameters
        jwtToken.setAlgorithm(JwtToken.SIGN_ALGORITHM.HS512.toString());
        jwtToken.setType(JwtToken.JWT);
        jwtToken.setClaimParameter("Amount", aAmount);
        jwtToken.setClaimParameter("Code", aCode);
        // Sign the token with the shared secret key and serialize it
        result = jwtToken.signAndSerialize(aKey.getBytes());
        return result;
    }
}
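Calling it is then just a matter of passing in the claim values and the shared secret. A quick sketch with hypothetical values:

String token = JWTGenerator.generateJWT("NL123", "100.00", "mySharedSecret");
System.out.println(token); // prints the serialized, signed JWT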

To call the generateJWT method from OSB, just create a custom XQuery library file, in my case custom-redrock-xquery.xml, which looks like this:

<?xml version="1.0" encoding="UTF-8" ?>
<xpf:xpathFunctions xmlns:xpf="http://www.bea.com/wli/sb/xpath/config">
	<xpf:category id="RedRock Custom Functions">
		<xpf:function>
			<xpf:name>generateJWT</xpf:name>
			<xpf:comment>Generate a JSON Web Token based on inputs</xpf:comment>
			<xpf:namespaceURI>http://www.redrock.nl/soa/xpath</xpf:namespaceURI>
			<xpf:className>nl.redrock.jwt.JWTGenerator</xpf:className>
			<xpf:method>java.lang.String generateJWT(java.lang.String, java.lang.String, java.lang.String)</xpf:method>
			<xpf:isDeterministic>false</xpf:isDeterministic>
			<xpf:scope>Pipeline</xpf:scope>
			<xpf:scope>SplitJoin</xpf:scope>
		</xpf:function>
	</xpf:category>
</xpf:xpathFunctions>

Stick this file under OracleHome\osb\config\xpath-functions, along with the jar file which contains your utility class, and don't forget the osdt_restsec.jar as it is a dependency which seems to be needed at deploy time. Then start JDev and in the XQuery Expression Builder popup you should now see the custom XQuery function.
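In an expression you can then call the function like any other; a minimal sketch, assuming a prefix bound to the namespace from the registration file above:

declare namespace red = "http://www.redrock.nl/soa/xpath";
red:generateJWT($code, $amount, $key)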

[Screenshot: the custom function in the XQuery Expression Builder]

Now just deploy your project to the Service Bus and run a test. If you enable execution tracing, you will see that the assign works and we get a token.

[Screenshot: the generated token in the execution trace]

Next select the token and go to JWT.io and check if it verifies:

[Screenshot: the token verified on JWT.io]

A first glance at Mule’s API capabilities

Since 2006, MuleSoft has been offering middleware and messaging. Back then, building integrations was still heavily XML-file based, with a set of Eclipse-based plugins. This has changed though! With Anypoint Studio, an Eclipse-based graphical development environment for designing, testing and running Mule flows, they have really upped their game. Together with their new iPaaS platform CloudHub and their API capabilities, MuleSoft is seen as a big player in the iPaaS and API arena, as shown in a previous article here, where both Gartner and Ovum rate them as leaders. To add to that, MuleSoft was placed in the Forbes Cloud 100 at number 20.


As I was quite curious how the new studio compares to my daily integration work with JDeveloper, I downloaded it and gave it a spin with a simple integration. You can download Anypoint Studio right here.

I started off by creating a simple Mule project: File > New > Mule Project and clicking Finish. As you can see, you can also use Maven, but I will leave this unchecked for now.

[Screenshot: the New Mule Project dialog]

After clicking Finish you will see that it generates a project structure on the left side:

[Screenshot: the generated project structure]

In the middle is the canvas onto which you can drag and drop items from the palette on the right. Straight away you can see there are quite a few connectors which come out of the box.

[Screenshot: the connector palette]

I will start off with an HTTP connector. You can search the palette by typing HTTP. Now just drag the component onto the canvas. When you select the component, you can see its properties at the bottom. I will keep the default HTTP Display Name, but I will set the Path to /simplemuleservice. Then I need to generate a listener configuration by clicking the + sign on the right. I will accept all the default values, which are 0.0.0.0 at port 8081. Next I am going to add a simple response. Look for Set Payload in the palette and drag it into the box next to the HTTP connector. Now select it and input a value. Mine is: Hello…this is a simple Mule service.

[Screenshot: the Set Payload properties]

Next, let's add some logging. Drag the Logger component onto the canvas. In the Message property of the Logger we can state what we want to log. The settings for the logging are defined in the log4j2.xml under resources. You can also use the Mule Expression Language here to access all sorts of info. I will just log all of the parameters sent with the request by using the following message: My simple mule service called with parameter #[message.inboundProperties.'http.query.params']

[Screenshot: the Logger properties]

Now save the whole project. Let's see if it also runs. Just right-click the application and choose Run As > Mule Application.

You can see the application is running.

[Screenshot: the running application in the console]

Now open your browser and type in the server and port you defined, in my case http://localhost:8081/simplemuleservice. As I also added the parameters to the Logger step, I will add some of those as well, like this: http://localhost:8081/simplemuleservice?caller=Hugo&message=Hi

As you will see the browser will respond with the output we defined.

[Screenshot: the response in the browser]

And the logging shows:
[Screenshot: the logger output]

As you can see, creating a flow is much easier than before. Underneath it is still all XML, but the visual wrapper around it makes it easier to work with, in my opinion. This was a very simple flow though. The next step is to do something which is a bit more work and more like an actual use case.

Let's say we have a SOAP service running on-premise. We want to expose that service as an API to the outside world. So basically three things:

  • Design the API first
  • Convert REST to SOAP and call the webservice
  • Convert the SOAP response back to a JSON result and return it

I want to make an API for a webservice which I have used before…the ConversionRateService. It has 2 input values, FromCurrency and ToCurrency, and it returns a conversion rate. So let's start with creating a simple RAML file. Mine looks like this:

#%RAML 1.0
title: Conversionrate API
version: v1.0
baseUri: http://conversionrate/api
mediaType: application/json
documentation:
  - title: Introduction
    content: |
      API to lookup currency conversion rates
/convert:
  get:
    responses:
      200:
        body:
          example: |
            {
                "conversionrate" : "1.2345",
                
            }

Now that we have our RAML file, start a new project, in my case simplemule2. Don’t forget to tick the Add APIkit components box below and select the RAML file we just created.

[Screenshot: the New Mule Project dialog with APIkit components enabled]

When you click Finish, Anypoint Studio will generate skeleton backend flows based on the RAML file. The next step is to delete the Set Payload in the get:/convert:conversionrate-config flow. This is the flow in which we will call our webservice. Drag the Web Service Consumer component onto the canvas. Then down below we can create a Connector Configuration.

[Screenshot: the Web Service Consumer configuration]

I used the WSDL from an old project here.

Now that I have the call to the webservice, I have to map the REST call to SOAP. Drag a Transform Message component before the Web Service Consumer. If you look at the properties below, you can see the input on the left side and the output on the right side. As the input parameters come in as HTTP query parameters, we can simply type the mapping into the editor on the right to state how we want to map them.

[Screenshot: the request mapping in the Transform Message component]

Next we want to transform the result from SOAP to a JSON response, so drag in another Transform Message, but this time place it after the Web Service Consumer component. Again look at the properties below. On the left side you can see the data model of the webservice response. The right side, the output, shows unknown. As I just want to map the response, you can edit the result on the right side as shown below:

[Screenshot: the response mapping in the Transform Message component]

As you can see, we will return a simple JSON response in which we map the conversionrate field to the webservice response field ConversionRateResult. Save it and we are ready to test the application. To make it easy for myself, I have created the conversionrate webservice with SoapUI, by importing the WSDL and creating a mock service. It is running on http://localhost:8005/mockConversionRateService, as defined in the Web Service Consumer component.
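For reference, the DataWeave script behind that response mapping looks roughly like this (a sketch; the exact element names and namespace handling depend on the WSDL you consume):

%dw 1.0
%output application/json
---
{
    conversionrate: payload.ConversionRateResponse.ConversionRateResult
}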

Now right-click the application and choose Run As > Mule Application. As you can see it will also start the APIkit console automatically. It shows the description of the API and an interface to make calls. I will just use a good old browser call like this: http://localhost:8081/api/convert?FromCurrency=EUR&ToCurrency=USD, which results in:

[Screenshot: the SoapUI mock receiving the request]

As you can see, it hits the SoapUI mock, which returns a SOAP response that is processed by Mule, which in turn returns a JSON response.

[Screenshot: the JSON response in the browser]

I was surprised how easy it was to create an API design-first. The UI works nicely and everything is deployed quickly on the integrated Mule server. I have only scratched the surface of Mule's capabilities here, but I like what I am seeing.

SOA Suite 12C : Patch to fix the JDev business rule component performance

A while back I did a post about the business rule engine; see here. At the end of that post I added some considerations. One was the very annoying performance of the business rule component in JDev when using decision tables. If you were editing them, the memory usage started growing and after 5 minutes JDev would crash. Editing large decision tables was also very slow. The worst part back then was that the export and import to Excel was broken as well.

performance

A while back there was a fix for that, but recently we found out that another JDev ADF patch greatly improves the usability of decision tables.

We applied the following 3 patches, but the last one really did the job.

  • Performance Patch (23754944)
  • Bundle Patch 161018 (20163149)
  • JDev ADF Security Patch (23754311)

Don’t forget to start JDev with the -clean parameter once to get a fresh start.

So next time you start pulling out your hair when editing a large decision table…think OPatch 23754311!

Integration Platforms as a Service in 2016.

Working in the integration business, you have probably noticed an increasing number of data sources which are no longer on-premise. With more and more PaaS and SaaS solutions, the need to integrate them becomes obvious. Salesforce should talk to your on-premise CRM system, and your custom applications should update your Twitter feed, for example. Back in the day, you would get a piece of software usually called something with Gateway in the name. It would be able to connect your internal enterprise software to the evil and scary outside world, and usually do something extra like authentication, authorisation and monitoring. But integrating with Salesforce is different than integrating with SAP, so you had to build quite a lot of custom software, which is expensive. You also had to configure every new connection to the outside world, as you don't want unwanted people fiddling around with your internal services.

How handy would it be if there was a platform located in the cloud which was able to connect to SaaS vendors out of the box? Then you would just have to make one connection from your company network to the platform, and every data stream could be configured through there.
Well ladies and gentlemen, meet iPaaS, aka Integration Platform as a Service.


At this moment there are quite a few vendors offering an iPaaS solution. You have the obvious big boys such as Oracle, Microsoft and IBM, but also some lesser software behemoths which, according to Forrester, Gartner and Ovum, are taking the cake. Let's look at the reports for 2016.

[Image: the Forrester Wave, Gartner Magic Quadrant and Ovum decision matrix for 2016]

In the left graph you can see the Forrester Wave, in the middle the Gartner Magic Quadrant, and in the right one the Ovum decision matrix.

As an integration consultant working with Oracle software, I was surprised Oracle wasn't present in 2 of the 3 reports. Oracle's Integration Cloud Service was identified by Gartner as a visionary, but they aren't among the leaders in the Magic Quadrant as they seem to lack the ability to execute at the moment.

Dell Boomi
Looking at all three reports, Dell Boomi has the best papers.

Dell Boomi serves SMBs and large enterprises with a unified, multipurpose integration platform to address multiple use cases, master data management (MDM), electronic data interchange (EDI), and API management. Dell Boomi's templates and crowdsourced data mapping suggestion capability enable it to support nonintegration specialists in a limited manner. Dell Boomi continues to innovate, introducing, for example, features like a new online community site and crowdsourced capabilities for support, suggestion, and error resolution. The vendor fits particularly well with SMBs looking for an all-purpose integration product to avoid investing in too many integration skills. Large enterprises can also adopt Dell Boomi as a tactical choice to complete their SOA strategic investments to cover cloud integration needs. Some large companies that are unhappy with their heavy SOA investments would benefit from turning to Dell Boomi as a strategic choice. Dell Boomi should guide integrators with governance and canonical formats to reduce the complexity of maintaining point-to-point interfaces.

  • It has a lot of application connectors right out of the box.
  • Web based IDE
  • Architected specifically for cloud-based delivery
  • Pricing starts from $550 per month

Mule CloudHub
MuleSoft is also scoring well, appearing in the leaders quadrant in two of the reports.

MuleSoft CloudHub has matured significantly since its introduction in February 2011, and is widely used by midsize-to-large enterprises for achieving cloud service integration. MuleSoft has executed an aggressive product roadmap and strategy to achieve impressive subscription growth over the last two-to-three-year period. MuleSoft CloudHub offers easy federation with Mule ESB, a lightweight and scalable ESB, to effectively support hybrid integration needs. In addition, MuleSoft Anypoint Platform for mobile enables API-led connectivity with backend applications and data sources, such as Salesforce.com, ServiceNow, SAP, and Siebel applications/platforms.

  • Also a lot of pre-built connectors
  • Platform lift-and-shift from on-premise to the Amazon cloud
  • Anypoint Studio, an Eclipse-based on-premise IDE
  • Pricing unknown

IBM WebSphere Cast Iron Cloud Integration
Also scoring strongly in all 3 reports.

IBM WebSphere Cast Iron Cloud Integration is a relatively mature solution capable of supporting a range of integration needs, including cloud-to-cloud, on-premise-to-cloud, and mobile application integration. It offers easy connectivity to several other WebSphere middleware platforms to cater for key integration requirements, including B2B integration (via IBM Sterling Commerce suite) and API management. It can be used with IBM Mobile Foundation bundle to achieve connectivity between mobile applications developed on IBM Worklight and other on-premise and SaaS applications.

  • Very complete solution with an ESB, BPM, BAM, MFT
  • WebSphere Cast Iron Studio, an on-premise IDE
  • Pricing unknown

Jitterbit
One of the runners-up…

The Jitterbit Harmony platform addresses multiple integration requirements, including data, process, hybrid, B2B, real-time API management, and IoT integrations via a single, comprehensive platform. Jitterbit is available through direct sales teams in North America, Europe, and Asia and includes a free 30-day trial available from the website. It has a differentiated partner program with more than 200 resellers, technology companies, and independent software vendors, including Autodesk, Microsoft, NetSuite, Salesforce, and SAP, to deliver prebuilt integration solutions and templates for specific business and industry processes that enable business users and technologists to quickly connect applications, data, and business processes across on-premises and cloud environments. Jitterbit's single, multitenant cloud platform fits particularly well with companies that are strategically moving to the cloud but need to connect with on-premises systems and databases. Jitterbit has 12 years of integration expertise, has 40,000 freemium and paid customers, and is based in the San Francisco Bay Area.

  • Limited market share but ambitious
  • Web-based IDE
  • Pricing starts from $2,000 a month

Using the Coherence Adapter in SOA Suite 12C

Retrieving data from a back-end system or executing something in a business rule engine can cost quite some time. When we build services we want to strive for performance, but sometimes you build services which you just can't make any faster. One thing to keep in mind is that it is possible to cache in SOA Suite using the Coherence adapter. For example, say retrieving data from a back-end system in a composite and enriching it takes 1400 ms on average, but over a 24-hour period 15% of the calls carry the same request, which should give exactly the same answer! Doing the retrieval and enrichment every time seems like a shame. This is where the Coherence adapter comes in.

The Coherence adapter can store certain pieces of data for a certain amount of time. This overview consists of 2 parts: the first is the configuration of the Coherence adapter in WebLogic, the second is the usage of the adapter in a composite. Now let's get started.

Configuring the Coherence adapter

  1. The Coherence adapter isn't active out of the box. It is not targeted to any managed server, so the first thing you have to do is target it.

    Go to your SOA server's console, go to Deployments and click the CoherenceAdapter. Go to the Targets tab and check whether it is targeted. If it is not, target it: click Lock & Edit, check the soa_cluster checkbox, then click Save and Activate Changes.

    [Screenshot: targeting the CoherenceAdapter]

  2. The next step is to add a cache configuration file. This step requires you to physically put an XML file on the file system of the server. In our example it is a file called ProductServiceCache-Configuration.xml and it sits in /u01/domains/dev_soa_domain. On a clustered environment the file has to be on both servers! My configuration file looks like:
    <?xml version="1.0"?>
    <!DOCTYPE cache-config SYSTEM "cache-config.dtd">
    <cache-config>
      <caching-scheme-mapping>
        <cache-mapping>
          <cache-name>ProductServiceCache</cache-name>
          <scheme-name>transactional</scheme-name>
        </cache-mapping>
      </caching-scheme-mapping>
      <caching-schemes>
        <transactional-scheme>
          <scheme-name>transactional</scheme-name>
          <service-name>DistributedCache</service-name>
          <autostart>true</autostart>
        </transactional-scheme>
      </caching-schemes>
    </cache-config>
    

    It is wise to create separate caches for separate services, in case you want to set different cache timers or need the occasional flush.

  3. The next step is creating a Coherence Adapter connection factory. Go to your SOA environment's console, go to Deployments and click the CoherenceAdapter. Go to the Outbound Connection Pools tab and click New. Choose javax.resource.cci.ConnectionFactory, input your JNDI name (I used eis/Coherence/ProductService) and click Finish. Now select the connection factory you just created and choose the Properties tab. Set the following values, the most important one being CacheConfigLocation, which should point to the cache configuration file from step 2:

    [Screenshot: the connection factory settings]
  4. As a last step, you will have to update the CoherenceAdapter deployment so it picks up the new settings. You can do this by stopping and starting the CoherenceAdapter: go to Deployments, click Lock & Edit, select the checkbox of the CoherenceAdapter and click Update and then Finish.

    [Screenshot: updating the CoherenceAdapter deployment]

Okay, now we have configured Coherence, including a cache for the ProductService.
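As mentioned in step 2, a cache for another service is simply an extra cache-mapping entry in the configuration file. A sketch, using a hypothetical CustomerServiceCache:

<cache-mapping>
  <cache-name>CustomerServiceCache</cache-name>
  <scheme-name>transactional</scheme-name>
</cache-mapping>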

Making use of the Coherence adapter in a composite

To make use of the adapter in your BPEL, you will have to add 2 JCA connections.
[Screenshot: the two JCA adapter references in the composite]

We are going to call them retrieveResult and writeResult. Let's start off with retrieveResult. Drag the Coherence Adapter onto the right part of the canvas and follow the wizard:

[Screenshot: the retrieveResult adapter wizard]

In step 4, when pressing Finish, the wizard will warn you: No cache key specified. No worries, as we will set the key later on.

Now we will create the writeResult JCA adapter:

[Screenshot: the writeResult adapter configuration]

I set the Time To Live of the cache to 10,000 milliseconds (10 seconds) for the test.

Now wire your BPEL to your 2 adapters like you would normally do. It will look like this:

[Screenshot: the wired composite]

Now in your BPEL you want to do 2 things:

  • Check if something is in the cache based on your key, and if it is, return the result.
  • If it is not in there, do your normal processing and write the result into the cache.

The reading and writing take place like normal call-outs, as you can see. Just drag an Invoke activity onto your BPEL and wire it to the correct partner link. Let's first do the retrieve. Invoke the retrieveResult and create input and output variables like you would normally do. For the unique key, go to the Properties tab of the invoke and add the property jca.coherence.Key. Here you define the unique key for your result; I just concatenated some strings from the request.
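In the .bpel source this shows up as a property on the invoke, roughly like this (a sketch; cacheKey, the variable names and the operation name are hypothetical):

<invoke name="InvokeRetrieveResult" partnerLink="retrieveResult" operation="Get"
        inputVariable="retrieveInput" outputVariable="retrieveOutput">
    <bpelx:toProperties>
        <bpelx:toProperty name="jca.coherence.Key" variable="cacheKey"/>
    </bpelx:toProperties>
</invoke>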

[Screenshot: the retrieve invoke with the jca.coherence.Key property]

In your BPEL you can check whether the retrieve call gave you data back. If it did, you can skip all the normal processing and just return the result. If you ended up with nothing, you do your normal processing and, at the end, write the result to the cache. Use the same property again and make sure you assign the same expression or variable you used for the retrieve.

[Screenshot: the write invoke with the same key property]

It is that easy. My composite will look something like this now:
[Screenshot: the final composite]

Now just build and deploy and see if it works!

My first SOAP call takes about 1777 ms…and that is without a cache hit. Now let's call it again within 10 seconds. There we go…800 ms. Now let's check the EM to see if we hit the actual cache:

[Screenshot: the cache statistics in Enterprise Manager]

There we go!

Some considerations:

  • This is an easy way to make things faster, but do keep checking with performance, load and stress tests what happens to your CPU and memory usage under big loads. The A-Team has some good articles about this.
  • You might want to have a different Time To Live per environment, for example 10 minutes on Dev but 1 hour on Test and 24 hours on Prod. You can easily do this using server tokens. Add them in the EM under Tokens and use them in the JCA files, like ${ProductServiceCacheTime} for example, as sketched below.
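A sketch of what that looks like in the write adapter's JCA file (assuming the wizard generated a TimeToLive property for your write operation):

<property name="TimeToLive" value="${ProductServiceCacheTime}"/>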

References:

  • http://www.ateam-oracle.com/using-the-12c-coherence-adapter-to-access-a-non-transactional-local-cache/
  • https://docs.oracle.com/middleware/1221/adapters/develop-soa-adapters/GUID-EA82A9A3-656E-464A-B764-E076C172BAEF.htm#TKADP2495

SOA Suite 12C: Add version information to your ServiceBus projects using custom maven plugin

As you have seen in my previous posts, it is possible to build your SB and SOA components using Maven; see here. One of the issues we encountered is that the Maven support for ServiceBus projects is very basic. For example, the description property in the Maven pom file is not mapped to the description field of a ServiceBus project. That would have been nice, as this is the only extra field we can use to hold some extra information…for example version info!

For SOA composites you are able to inject some versioning info by updating the composite.xml using a Google Maven plugin. Just add this plugin to the build:

<!-- Needed to replace the revision in the composite.xml due to bug (20553998), which causes the revision not to be updated correctly -->
<plugin>
	<groupId>com.google.code.maven-replacer-plugin</groupId>
	<artifactId>replacer</artifactId>
	<version>1.5.3</version>
	<executions>
		<execution>
			<phase>initialize</phase>
			<goals>
				<goal>replace</goal>
			</goals>                   
		</execution>
	</executions>
	<configuration>
		<ignoreMissingFile>true</ignoreMissingFile>
		<file>${scac.input}</file>
		<xpath>//composite/@revision</xpath>
		<token>^.*$</token>
		<value>${composite.revision}</value>
	</configuration>
</plugin>

which results in:

[Screenshot: the composite revision in EM]

For SB projects there is no such thing. How can we then see which version we have deployed?! Well, the only field we can use is the description field. The only problem is that you cannot update it using Maven. The things you have to do are:

  • Unzip the sbconfig.sbar
  • Update the _projectdata.LocationData file which holds a proj:description tag
  • Zip the sbconfig.sbar again
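Manually that boils down to something like this (a sketch using standard zip tooling; the plugin below simply automates these steps):

unzip sbconfig.sbar -d sbar_tmp
# edit sbar_tmp/_projectdata.LocationData and set the proj:description tag
cd sbar_tmp && zip -r ../sbconfig.sbar .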

Not too difficult at all. You could probably do this using Ant, but that is so 2001! We have Maven now, so why not write a custom Maven plugin which does all this for you?! I won't bother you with the Java details, but you can download the plugin jar here!

Just install it into your Maven repository by running:

    mvn install:install-file -Dfile=version-information-plugin-1.0.jar -DgroupId=nl.redrock.maven.plugins.servicebus -DartifactId=version-information-plugin -Dversion=1.0 -Dpackaging=jar

Now that you have the plugin installed you can wire it to the package phase of your service bus project by adding the plugin to your service bus pom file. Mine looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<project xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd"
         xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <modelVersion>4.0.0</modelVersion>
  <parent>
    <groupId>com.oracle.servicebus</groupId>
    <artifactId>sbar-project-common</artifactId>
    <version>12.1.3-0-0</version>
  </parent>
  <groupId>nl.redrock</groupId>
  <artifactId>ConversionRateService</artifactId>
  <version>1.1.2.5</version>
  <packaging>sbar</packaging>
  <description></description>
  <build>
    <plugins>
     <plugin>
        <groupId>nl.redrock.maven.plugins.servicebus</groupId>
        <artifactId>version-information-plugin</artifactId>
        <version>1.0</version>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>version</goal>
            </goals>
          </execution>
        </executions>
        <configuration>
          <description>${project.version}</description>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>

This will call the plugin after the normal packaging has completed. In the configuration you can set the description; in my case I fill it with the version info of the component itself. Now just build your service bus project using Maven:

[Screenshot: the Maven build output]

and deploy and go to the sbconsole and voila:

[Screenshot: the version info in the description field in the sbconsole]

Installing SOA BP 12.1.3.0.5 makes the flow instance title disappear. Incoming workaround!

After installing bundle patch 12.1.3.0.5, we noticed that the flow instance titles of our instances in SOA were flaky, meaning sometimes they appeared, but sometimes they didn't. After some testing and going back and forth with Oracle support, we were not able to reproduce the bug consistently. We did come up with a workaround though, and that is forcing a dehydrate. Not the best option in my opinion, but a possible workaround. So if you have this issue, just add a dehydrate activity to the BPEL and magically see your flow instance titles come back to life again.

[Screenshot: the dehydrate activity in BPEL]

When this bug is going to be resolved is still unknown.