Lessons from API integration

A general transition has been happening in the nature of the work I do in the integration space … we have been doing fewer SOAP/XML/RPC web services and more RESTful APIs for “digital enablement” of enterprise services. This brought a paradigm shift and valuable lessons were learnt (rightly or wrongly) … and of course the process of learning and comparing never stops!

Below are my observations …

It is not about SOAP vs REST … it is about products that we integrate, the processes we use and the stories we tell as we use one architectural style vs the other

  1. Goals: Both architectural styles work to achieve the same goal – integration. The key differences lie in where they are used. SOAP/XML has seen some B2B use, but adoption by web/mobile clients is low
  2. Scale: SOAP/XML is good but will always remain enterprise-scale … REST/JSON is web-scale. What do I mean by that? REST over HTTP with JSON feels easier to communicate, understand and implement.
  3. Protocol: REST uses the HTTP protocol more fully than SOAP does, so the former wins in making your service easy to adopt
  4. Products: The big monolith integration product vendors are now selling “API Gateways” (similar to a “SOA product suite”) … The gateway should be a lightweight policy layer, IMHO, but traditional vendors like to sell “app server” licenses that bloat an API gateway product (buyer beware!)
  5. Contract: RAML/YAML is a lot easier to read than a WSDL … which is why contract-first works better when doing REST/JSON (a sample contract is sketched just after this list)
  6. Process: The biggest paradigm change we have experienced is doing Contract Driven Development by writing the API definition in RAML/YAML … compare this to generating a WSDL or RAML from services. Contract-driven, as I am learning, is much more collaborative!
  7. Schemas: XML schemas were great until they started describing restrictions … REST with JSON schema is good but may be repeating similar problems at Web scale
  8. Security: I have used OAuth for APIs and watched it enable token authentication and authorization easily for clients outside the enterprise – we are replacing traditional web session ids with OAuth tokens to enable faster integration with 3rd-party clients … all this through simple configuration in an API gateway, compared to some of the struggle with SAML setup within an organisation on the heavy monolithic SOA and security products!
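
To make the contract-first point concrete, here is a minimal sketch of a RAML contract for a hypothetical Customer API (resource names, types and fields are illustrative, not from a real project):

#%RAML 1.0
title: Customer API
version: v1
baseUri: https://api.example.com/{version}

types:
  Customer:
    properties:
      id: string
      name: string

/customers:
  get:
    description: List all customers
    responses:
      200:
        body:
          application/json:
            type: Customer[]
  /{customerId}:
    get:
      description: Fetch a single customer by id
      responses:
        200:
          body:
            application/json:
              type: Customer

A consumer can review (and even mock) a contract like this before a single line of service code is written – which is where the collaboration comes from.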

It would be easy to then conclude that we ought to rip out all our existing SOAP/XML integrations and replace them with REST APIs, no? Well, not quite … as always, “horses for courses”.

Enterprise-grade integration may require features currently missing in REST/JSON (WS-*, RPC), and legacy systems may not be equipped to do non-XML integration

My goal was to show you my experience in integrating systems using APIs with contract-driven development and gateways with policies, versus traditional web service development using SOA products … hope to hear about your experience – how do you like APIs?

Camunda BPM – Manual Retry

Camunda BPM is a lightweight, open-source BPM platform (see here: https://camunda.com/bpm/features).

The “Cockpit” application within Camunda is the admin dashboard where deployed processes can be viewed at a glance and details of running processes are displayed by process instance id. Clicking on a process instance id reveals runtime details while the process is running – process variables, task variables etc. If an error is thrown from any of the services in a flow, the Cockpit application allows “retrying” the service manually after the coded automatic retries have been exhausted.

The steps below show how to retry a failed process from the Cockpit administrative console

Manual Retry

From the Process Detail screen, select a failed process (click on the GUID link)

Then on the right-hand side, under “Runtime”, you should see a “semi-circle with arrow” icon indicating “recycle” or “retry” – click on this

This launches a pop-up with check-boxes listing the IDs of the failed service tasks and an option to replay them. Notice these are only the tasks for “this” instance of the “flow”

After “Retry” is clicked, another pop-up indicates the status of the action (as in “was the engine able to process the retry request” – the actual services may have failed again)
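
The same retry can also be scripted. A minimal sketch using Camunda’s REST API – assuming the default engine-rest context path, and a job id taken from the incident details in Cockpit:

# reset the remaining retries for the failed job to 1 so the engine picks it up again
curl -X PUT http://localhost:8080/engine-rest/job/<job-id>/retries \
  -H "Content-Type: application/json" \
  -d '{"retries": 1}'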


Microservices with Docker and Spring Boot

This guide is for someone interested in quickly building a microservice using the Spring Boot framework. It works great if you have prior JAX-RS / Jersey experience

Step-by-step guide


  1. Create a Maven Project
    1. Final Structure
    2. Pom (see the snippets in the steps below)
  2. Include Spring Boot Dependencies in your pom
    1. Parent Project
      	<parent>
      		<groupId>org.springframework.boot</groupId>
      		<artifactId>spring-boot-starter-parent</artifactId>
      		<version>1.3.1.RELEASE</version>
      	</parent>
    2. Dependencies
      <dependencies>
        <dependency>
          <groupId>junit</groupId>
          <artifactId>junit</artifactId>
          <scope>test</scope>
        </dependency>

        <dependency>
          <groupId>org.springframework.boot</groupId>
          <artifactId>spring-boot-starter-web</artifactId>
        </dependency>

        <dependency>
          <groupId>org.springframework.boot</groupId>
          <artifactId>spring-boot-starter-jersey</artifactId>
        </dependency>
      </dependencies>

  3. Create the Java main class – this is the entry point, annotated with @SpringBootApplication
    1. App.java
      import org.springframework.boot.SpringApplication;
      import org.springframework.boot.autoconfigure.SpringBootApplication;

      @SpringBootApplication
      public class App
      {
          public static void main( String[] args )
          {
          	SpringApplication.run(App.class, args);
          }
      }
  4. Create the API Controller class
    1. Add Spring MVC methods for REST in the API controller class
      1. SimpleAPIController.java with Spring MVC only (note: the JAX-RS @Path annotation is not needed for this version)
        import java.util.HashMap;
        import java.util.Map;

        import org.springframework.web.bind.annotation.RequestMapping;
        import org.springframework.web.bind.annotation.RestController;

        @RestController
        public class SimpleAPIController {
        	@RequestMapping(value = "/springmvc", produces = "application/json")
            public Map<String, Object> springMvc() {
        		Map<String, Object> data = new HashMap<String, Object>();
        		data.put("message", "Spring MVC Implementation of API is available");
        		return data;
            }
        }
    2. (Alternate/Additional) Add Jersey methods for REST in the API controller class
      1. SimpleAPIController.java with Jersey (Note: you need the Spring Boot Jersey dependency in your pom; @RestController is not needed for the JAX-RS version)
        import java.util.HashMap;
        import java.util.Map;

        import javax.ws.rs.GET;
        import javax.ws.rs.Path;
        import javax.ws.rs.Produces;
        import javax.ws.rs.core.Response;

        import org.springframework.stereotype.Component;

        @Component
        @Path("/api")
        public class SimpleAPIController {
        	@GET
        	@Produces({ "application/json" })
        	public Response getMessage() {
        		Map<String, Object> data = new HashMap<String, Object>();
        		data.put("message", "Jersey Implementation of API is available");
        		return Response.ok().entity(data).build();
        	}
        }
    3. (Alternate/Additional) Create a JerseyConfig class
      1. JerseyConfig.java – required for the Jersey REST implementation
        import org.glassfish.jersey.server.ResourceConfig;
        import org.springframework.context.annotation.Configuration;

        @Configuration
        public class JerseyConfig extends ResourceConfig {
            public JerseyConfig() {
                register(SimpleAPIController.class);
            }
        }
  5. Build the Maven project
    1. Add the Spring boot maven build plugin
      	<build>
      		<plugins>
      			<plugin>
      				<groupId>org.springframework.boot</groupId>
      				<artifactId>spring-boot-maven-plugin</artifactId>
      			</plugin>
      		</plugins>
      	</build>
  6. Execute the Spring Boot application
    1. Run in Eclipse as Java Application
    2. Run as a standalone Java Jar
      java -jar target/SimpleService-0.0.1-SNAPSHOT.jar
    3. Specify a different port and run as a standalone Java Jar

      java -Dserver.port=8090 -jar target/SimpleService-0.0.1-SNAPSHOT.jar
    4. Running in a docker container

      1. See: https://spring.io/guides/gs/spring-boot-docker/
      2. Dockerfile
        FROM java:8
        VOLUME /tmp
        ADD SimpleService-0.0.1-SNAPSHOT.jar app.jar
        RUN bash -c 'touch /app.jar'
        ENTRYPOINT ["java","-Djava.security.egd=file:/dev/./urandom","-jar","/app.jar"]
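
      To build the image and smoke-test the service end-to-end – a minimal sketch, assuming the Dockerfile sits in the project root and the image tag simpleservice is yours to choose:

        # package the application jar
        mvn package

        # the ADD in the Dockerfile above expects the jar in the build context
        cp target/SimpleService-0.0.1-SNAPSHOT.jar .

        # build and run the image, mapping the application port
        docker build -t simpleservice .
        docker run -p 8080:8080 simpleservice

        # hit the Jersey endpoint
        curl http://localhost:8080/api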


The official how-to is here: https://spring.io/guides/gs/rest-service/#use-maven and a great blog post is here: http://blog.codeleak.pl/2015/01/getting-started-with-jersey-and-spring.html

Dockerized Java Application Performance Analysis

Using JMX to analyse a Java Virtual Machine (JVM) running within a local or remote Docker container. The example below explores how to analyse a Tomcat server (version 7) running in a Docker container using Oracle JRockit Mission Control (JMC)


Your JRockit Mission Control is here

  • OSX: /Library/Java/JavaVirtualMachines/{JDK}/Contents/Home/bin/
  • Windows: JDK_HOME/bin/


Step-by-step guide


  1. Determine your docker machine’s IP; for example, on my machine I did
    1. Do `docker-machine active` to see the active machines 

      1. On my machine this returned `default` as the machine name

    2. Do `docker-machine ip default` to see the IP of the machine named `default`

      1. On my machine this returned `192.168.99.100`


  2. Create a setenv.sh script with the JMX options
    1. Set the `CATALINA_OPTS` env variable as shown in the sketch below
    2. Set the “java.rmi.server.hostname” to the IP obtained in step (1.b) above
    3. Please note that we are not using SSL nor JMX remote authentication – so the config below is for DEV only
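
      A minimal setenv.sh sketch – assuming 1898 as the JMX remote port and 62911 as the RMI/TCP transport port (matching the port mappings in step 4), and the docker-machine IP from step 1:

        # DEV only: SSL and authentication are disabled
        export CATALINA_OPTS="-Dcom.sun.management.jmxremote \
          -Dcom.sun.management.jmxremote.port=1898 \
          -Dcom.sun.management.jmxremote.rmi.port=62911 \
          -Dcom.sun.management.jmxremote.ssl=false \
          -Dcom.sun.management.jmxremote.authenticate=false \
          -Djava.rmi.server.hostname=192.168.99.100"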
  3. Use the setenv.sh script within the tomcat runtime
    1. In your Dockerfile you can do
      1. `ADD setenv.sh /apache-tomcat-7.0.62/bin/setenv.sh`  when using an Apache Tomcat image
      2. or `ADD setenv.sh /camunda/bin/setenv.sh` when using Camunda (embedded or standalone)

    2. You can add this at runtime as well using
      1. `-v ${PWD}/setenv.sh:/apache-tomcat-7.0.62/bin/setenv.sh` when using an Apache Tomcat image
      2. or `-v ${PWD}/setenv.sh:/camunda/bin/setenv.sh` when using Camunda (embedded or standalone)
  4. Start the Docker container and map JMX ports
    1. To the `docker run` command add the following ports (a full command sketch follows this step)
      1. The JMX Remote and RMI Port  `-p 1898:1898`
      2. and  TCP Transport Port `-p 62911:62911`
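
      Putting it together – a sketch of the `docker run` command, with the runtime volume mount from step 3 and an illustrative image name (the paths must match your image’s layout):

        docker run -d \
          -v ${PWD}/setenv.sh:/apache-tomcat-7.0.62/bin/setenv.sh \
          -p 8080:8080 -p 1898:1898 -p 62911:62911 \
          <your-tomcat-image>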
  5. Validate the Docker container is exposing the ports
    1. Do a `docker ps` and examine your image’s ports
    2. Do a `docker exec -i -t` and validate the contents of your setenv.sh file to confirm the CATALINA_OPTS are set
      1. Not doing this will throw you off if the setenv.sh was not copied! Simply exposing the Docker container ports does not mean the JVM allows JMX connections
      2. Example: docker exec command to view the contents of the file for apache-tomcat-7.0.62, sketched below
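
      A sketch of that command, assuming the container id comes from `docker ps`:

        docker exec -i -t <container-id> cat /apache-tomcat-7.0.62/bin/setenv.sh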
  6. Run the JRockit Mission Control
    1. Comes with the JDK
    2. On a Mac you can find it here “/Library/Java/JavaVirtualMachines/{JDK}/Contents/Home/bin/”  and the executable is called “jmc”

  7. Add a Remote connection in JMC

    1. File -> Connect
    2. Provide the JMX Connection properties
      1. Host is the Docker Machine IP we obtained in Step 1 above
      2. Port is the JMX Remote Port, for example: 1898 in Step 2 above
      3. Name the Connection “Docker-<image-name>”
      4. Test the Connection


    3. Your JVM shows up in JRockit
  8. Start JMX Console and view JVM metrics


See more details on how to use the JRockit JVM tool here


API Performance Testing with Gatling

The Problem

“Our API Gateway is falling over and we expect a 6-fold increase in our client base and a 10-fold increase in requests, our backend service is scaling and performing well. Please help us understand and fix the API Gateway” 

Tasks

It was pretty clear we had to run a series of performance tests simulating the current and new user load, then apply fixes to the product (API Gateway), rinse and repeat until we met the client’s objectives.

So the key tasks were:

  1. Do some Baseline tests to compare the API Gateway Proxy to the Backend API
  2. Use tools to monitor the API Gateway under load
  3. Tune the API Gateway to Scale under Increasing Load
  4. Repeat until finely tuned

My experience with API performance testing was limited (Java Clients) so I reached out to my colleagues and got a leg up from our co-founder (smart techie) who created a simple “Maven-Gatling-Scala Project” in Eclipse to enable me to run these tests.

Gatling + Eclipse + Maven

  • What is Gatling? Read more here: http://gatling.io/docs/2.1.7/quickstart.html
  • Creating an Eclipse Project Structure
    • Create a Maven Project
    • Create a Scala Test under /src/test/scala
    • Create gatling.conf, recorder.conf and a logback.xml under /src/test/resources
    • Begin writing your Performance Test Scenario
  • How do we create Performance Test Scenarios?
    • Engine.scala – Must be a Scala App extension
    • Recorder.scala – Must be a Scala App
    • Test – Must extend Gatling Simulation; we created an abstract class and extended it in our test cases so we can reuse some common variables
    • gatling.conf – This file contains important configuration which determines where the test results go, how to make the HTTP calls etc. See more details here: http://gatling.io/docs/2.0.0-RC2/general/configuration.html

Here is a screenshot of my workspace

[Screenshot: Eclipse workspace with the Maven-Gatling-Scala project]

Executing the Gatling Scala Test

  • Use Maven Gatling Plugin See https://github.com/gatling/gatling-maven-plugin-demo
  • Make sure your pom file has the gatling-maven-plugin artifact as shown in the screen shot below
  • Use the option “-DSimulationClass” to specify the Test you want to run (For Example: -DSimulationClass=MyAPIOneTest)
  • Use the option “-Dgatling.core.outputDirectoryBaseName” to output your reports to this folder

[Screenshot: pom.xml with the gatling-maven-plugin]
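
Putting the options above together – a sketch of the full command, using the hypothetical simulation class name from above:

mvn gatling:execute -DSimulationClass=MyAPIOneTest -Dgatling.core.outputDirectoryBaseName=myapione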


Creating A Performance Test Scenario

Our starting point for any sort of technical analysis of the API gateway was to study what it did over a varying load and build a baseline. This had to be done by invoking a few of the APIs during the load to simulate varying requests per second (for example: one API is invoked every 5 seconds while another every 10 seconds).

After reading the documentation for Gatling and trying out a couple of simple tests, we were able to replicate our requirement of ‘n’ users with 2 scenarios executing HTTP invocations against two different API calls, with different ‘pause’ times calculated from the ‘once every x seconds’ requirement. Each scenario took ‘n/2’ users to simulate the ‘agents’ / ‘connections’ making different calls, and the scenarios were run in parallel.

  • We used a simple spreadsheet to put down our test ranges, write down what we will vary, what we will keep constant and our expectations for the total number of Requests during the Test duration and expectations for the Requests Per Second we expect Gatling to achieve per test
  • We used the first few rows of the spreadsheet to do a dry run, then matched the Gatling report results to our expectations in the spreadsheet and confirmed we were on the right track – that is, as we increased the user base, our numbers for RPS and how the APIs were invoked matched our expectations
  • We then coded the “mvn gatling:execute” calls into a run script which would take in User count, Test Duration and Hold times as arguments and also coded a Test Suite Run Script to run the first script using values from the Spreadsheet Rows
  • We collected the Gatling reports into a Reports folder and served it via a simple HTTP server
  • We did extensive analysis of the Baseline Results, monitored our target product using tools (VisualVM, Oracle JRockit), did various tuning (that is another blog post) and re-ran the tests until we were able to see the product scale better under increasing load

Performance Test Setup

The setup has 3 AWS instances created to replicate the Backend API and the API Gateway, and to host the performance tester (the Gatling code). We also use Node.js to serve the Gatling reports through a static folder mapping; that is, during each run the Gatling reports are generated into a ‘reports’ folder mapped under the ‘public’ folder of a simple HTTP server written with the Node/Express framework.

[Diagram: performance test setup]

API Performance Testing Toolkit

The Scala code for the Gatling tests was shared as a screenshot – notice “testScenario” and “testScenario2” pause for 3.40 seconds and 6.80 seconds, and the Gatling setup runs these two scenarios in parallel; a reconstructed sketch follows. See http://gatling.io/docs/2.0.0/general/simulation_setup.html

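A minimal sketch of that simulation – the class name, base URL and API paths are illustrative, not the real endpoints:

import io.gatling.core.Predef._
import io.gatling.http.Predef._
import scala.concurrent.duration._

class BaselineSimulation extends Simulation {

  // illustrative values – our run scripts passed these in as arguments
  val users = Integer.getInteger("users", 2).intValue
  val testDuration = 300 // seconds

  val httpConf = http.baseURL("http://my-api-gateway:8080")

  // call API one roughly once every 3.40 seconds per user
  val testScenario = scenario("API One")
    .during(testDuration seconds) {
      exec(http("api-one").get("/api/one").check(status.is(200)))
        .pause(3.40 seconds)
    }

  // call API two roughly once every 6.80 seconds per user
  val testScenario2 = scenario("API Two")
    .during(testDuration seconds) {
      exec(http("api-two").get("/api/two").check(status.is(200)))
        .pause(6.80 seconds)
    }

  // run both scenarios in parallel, splitting the users between them
  setUp(
    testScenario.inject(atOnceUsers(users / 2)),
    testScenario2.inject(atOnceUsers(users / 2))
  ).protocols(httpConf)
}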

Results

  • 2-user tests reveal we coded the scenario right, and calls are made to the different APIs over differing time periods as shown below

[Screenshots: Gatling reports for the 2-user dry run]

  • As we build our scenarios, we watch the users ramp up as expected, and performance of the backend and gateway degrades as expected with more and more users
    • Baseline testing

[Screenshot: baseline test results]

    • Before tuning

[Screenshot: results before tuning]

  • Finally we use this information to tune the components (change the garbage collector, add a load balancer etc.) and retest until we meet and exceed the desired results
    • After tuning

[Screenshot: results after tuning]

Conclusion

  • Scala/Gatling gave us a repeatable, scriptable way to baseline the API Gateway, watch it degrade under increasing load, and verify each round of tuning until the client’s targets were met

Groovy Grape Turns Sour – java.lang.RuntimeException: Error grabbing Grapes — [download failed:

Issue:

Trying to run a Ratpack Groovy code [ @Grab('io.ratpack:ratpack-groovy:1.0.0') ]

groovy -Dgroovy.grape.report.downloads=true -Dratpack.port=8081 server.groovy

The Error:

General Error:  

java.lang.RuntimeException: Error grabbing Grapes -- [download failed:

Specific Error:

java.lang.RuntimeException: Error grabbing Grapes -- [download failed: org.yaml#snakeyaml;1.12!snakeyaml.jar(bundle), download failed: com.google.guava#guava;18.0!guava.jar(bundle)]

The fix:

Delete the repository folders for the affected dependencies in repo managers like Maven (and in the Grapes cache).

Why? Because the ~/.groovy/grapes repo for the dependency has a property file configured to read from the local Maven repo instead of the remote repo (the download fails because the location is a local file path, not an HTTP link)

Detailed Steps:

  1. Step 1:  Use Grape Resolve to load the dependency and show the error

    grape resolve com.google.guava guava 18.0

  2. Step 2:  Find the dependency property file and figure out where this is pulling from

    cat ~/.groovy/grapes/com.google.guava/guava/ivydata-18.0.properties

    #ivy cached data file for com.google.guava#guava;18.0
    #Thu Oct 01 15:49:19 AEST 2015
    artifact\:ivy\#ivy\#xml\#1163610380.original=artifact\:guava\#pom.original\#pom\#361716139
    artifact\:guava\#pom.original\#pom\#361716139.original=artifact\:guava\#pom.original\#pom\#361716139
    resolver=localm2
    artifact\:guava\#pom.original\#pom\#361716139.exists=true
    artifact\:ivy\#ivy\#xml\#1163610380.exists=true
    artifact\:ivy\#ivy\#xml\#1163610380.location=file\:/Users/alokmishra/.m2/repository/com/google/guava/guava/18.0/guava-18.0.pom
    artifact.resolver=localm2
    artifact\:guava\#pom.original\#pom\#361716139.location=file\:/Users/alokmishra/.m2/repository/com/google/guava/guava/18.0/guava-18.0.pom
    artifact\:ivy\#ivy\#xml\#1163610380.is-local=true
    artifact\:guava\#pom.original\#pom\#361716139.is-local=true
  3. Step 3: Delete from Maven
    1. Locate the Dependency in your ~/.m2/repository
    2. Delete the repository for this Dependency
      1. rm -r -f /Users/alokmishra/.m2/repository/com/google/guava/
    3. Delete the repository from Groovy
      1.  rm -r -f /Users/alokmishra/.groovy/grapes/com.google.guava/guava/
  4. Step 4: Retest
    1. grape resolve  com.google.guava guava 18.0

      /Users/alokmishra/.groovy/grapes/com.google.guava/guava/jars/guava-18.0.jar
    2. groovy -Dgroovy.grape.report.downloads=true -Dratpack.port=8081 server.groovy

      [SUCCESSFUL ] com.google.guava#guava;18.0!guava.jar(bundle) (4353ms)

Oracle AS Adapter for Peoplesoft: IWAY Error Resolution

ERROR com.ibi.bse.ConfigWorker:java.lang.NoClassDefFoundError: oracle/tip/adapter/api/exception/PCResourceException

Solution:

It appears that Oracle took the iWay servlet and built a Java Swing application around it that allows you to create a Web service (BSE) or J2CA-based connection to the enterprise applications (Siebel, JDE, PeopleSoft).

This Swing application is launched in a Unix/Linux shell using the iwae.sh script (in Windows, the ae.bat batch file) and uses the libraries in the install folder’s lib sub-folder.

After deploying the “iwafjca” RAR and WAR and the “ibse” WAR that came with the installation to my WebLogic server (SOA domain), I tried to connect to the servlets, create a BSE or J2CA config and then connect to an enterprise application … however, the “Adapter” node in the Swing tree would not expand. After setting Option -> Debug on in the Swing app, I could see in the logs that I had exceptions due to missing libraries.

I found that the SOA adapter libraries for 11g in the SOA Home / modules path need to be copied as shown below. Hope this helps!

Copy the missing libraries from

${middleware.home}/Oracle_SOA1/soa/modules/oracle.soa.adapter_11.1.1/

to the iWay install path, for example:

${middleware.home}/Oracle_SOA1/soa/thirdparty/ApplicationAdapters/lib
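
A one-line sketch of that copy (the exact jar list may vary by installation):

# copy the 11g adapter jars into the iWay lib folder
cp ${middleware.home}/Oracle_SOA1/soa/modules/oracle.soa.adapter_11.1.1/*.jar \
   ${middleware.home}/Oracle_SOA1/soa/thirdparty/ApplicationAdapters/lib/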


Programmatically Populate View Objects In Oracle ADF

Introduction

I have recently been building ADF UI components and noticed a small difference between the Data Controls of custom methods in an Application Module vs in a POJO service. For example, consider the following method “getCountByStatus”

public List<StatusCountTableRow> getCountByStatus() { ... }

when implemented in a POJO service, you create a data control by right-clicking on the Java service class in JDeveloper, and the Data Control looks something like this – the fields of the List element are displayed (and when dropped on a JSF page there are two columns, for example, Count and Status)

but when you implement the same in an Application Module, you get a “method” binding with an “element” returned.

Problem:

The issue is that when you drag and drop the Data Control (DC) created from the POJO you get the columns based on the attributes of the method element. However, when you use the DC from the Application Module – you only get one column in the table component.

An alternative is to use a POJO Service as a wrapper on the Application Module and … wait! Let me stop you there. Bad idea! You will need to track when to release the AM etc and it is better to use the framework components as intended.

Solution:

The answer is to create a View Object that wraps the method call in a programmatic query, and to associate this VO with the Application Module. When you do this, the data control of the App Module is able to render the View Object’s row elements into the table component on a JSF page. (Phew! I hope I said that right.)

So this is how it looks like ….

How to steps:

In my example, I have a query that does a

"Select count(*) from <table> where status=:bindStatus"

and gets a “COUNT by STATUS” from the table. I have a list of STATUS which I need to iterate over and build a list which contains the “status” and “count” values.

Notice how the “status” and “count” are in a class called

"StatusCountTableRow"

and the

"UserStatusData"

class holds a

"List<StatusCountTableRow>".

I also expose the “Iterator” for the list in the UserStatusData class so as to allow a client to get “next()” and “size()” etc.

Now we are ready to create the Read Only View Object – programmatically … create an ADF View Object and select the following option

Next create the fields and assign the ROVO to the Application Module (easy steps)

Finally edit the View Object Impl ….


…. and override the following methods:

Override “Create”:

Obviously, this is called when the VO instance is created; it calls super.create() and nullifies the query elements that a SQL-based VO would require

[See: http://mucahiduslu.blogspot.com/2008/05/programmatic-view-object-using-ref.html]

protected void create() {
    super.create();
    getViewDef().setQuery(null);
    getViewDef().setSelectClause(null);
    setQuery(null);
}

Override “executeQueryForCollection”:

This method gets the “query collection” as the variable “qc”, along with the query params. I used this spot to execute my query … i.e. I get an instance of the Application Module, execute the method on it to get the List<StatusCountTableRow> from the AM, create an instance of “UserStatusData” and set the result from the AM on the new UserStatusData instance …

After that I set the user data on the view object; the method “setUserDataForCollection()” takes plain objects – so your user data class can hold anything, as long as it can iterate over a result set and get you the data. This is a key part of using the framework … we cache the user data using setUserDataForCollection() and later retrieve it using “getUserDataForCollection()”

[See: http://download.oracle.com/docs/cd/E14004_01/books/SSDev/SSDev_CustomizingModel13.html#wp1031402]

protected void executeQueryForCollection(Object qc, Object[] params,
        int noUserParams) {
    UserStatusData usd = new UserStatusData(getOrderByClause());
    usd.setResultsList(((AssetMgmtAMImpl) getApplicationModule()).getCountByStatus());
    setUserDataForCollection(qc, usd);
    super.executeQueryForCollection(qc, params, noUserParams);
}

Override “hasNextForCollection”:

This is important so as to tell the ViewObject when to stop … notice how I do a “getIterator” and not a “getList().getIterator()” – try this and see why for yourself.

protected boolean hasNextForCollection(Object qc) {
    if (((UserStatusData) getUserDataForCollection(qc)).getIterator().hasNext()) {
        return true;
    } else {
        setFetchCompleteForCollection(qc, true);
        return false;
    }
}

Override “createRowFromResultSet”:

So far we have been dealing with creating the query, storing the result and finding out whether we have reached the end or not … but how does the type “StatusCountTableRow” get converted to a table row on a page? (Remember our core issue.) The magic happens here. This is where we take the “next()” row from the “Iterator” in the “UserData” which we had cached in the framework, and “map” the values to the fields we created in the View Object.

See how “populateAttributeForRow(row, 0, nextRow.getStatusName())” is used to map the nth column of the “row” to a value in the “StatusCountTableRow” instance retrieved using iterator.next()

protected ViewRowImpl createRowFromResultSet(Object qc,
        ResultSet resultSet) {
    ViewRowImpl row = createNewRowForCollection(qc);
    StatusCountTableRow nextRow =
        ((UserStatusData) getUserDataForCollection(qc)).getIterator().next();
    populateAttributeForRow(row, 0, nextRow.getStatusName());
    populateAttributeForRow(row, 1, nextRow.getCount() + "");
    return row;
}

Override “getQueryHitCount”:

Override this method to return the size of the result

public long getQueryHitCount(ViewRowSetImpl viewRowSet) {
    return ((UserStatusData) getUserDataForCollection(viewRowSet.getQueryCollection()))
        .getResultsList().size();
}

Define/Assign Custom Payload for your BPEL Human Task

When you create a BPEL Human Task, the payload type is not specified by default. You can specify a simple type from the drop-down in the “Data” menu item on the BPEL Task Editor page.

In a BPEL Task, see which menu item to select ("Data") to change the payload type

By default, if you have not assigned a payload type, then the “payload” element is not defined in your BPEL Task’s schema. The BPEL Human Task references the “ApproveTaskWorkflowTask” schema, which has an element called “Payload” that is imported from the “ApproveTaskPayload.xsd” schema definition file.


When the payload is not defined, the structure is empty … in “ApproveTaskWorkflowTask.xsd” we have an empty payload

… and “ApproveTaskPayload.xsd” is likewise empty

I created a simple schema for my approval process where the top-level element is “TimeSheet” (which has a list of TimeSheetDays, which have details in them). I wanted to use this schema as the payload – a sketch of its shape is below …
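
The original schema was shown as an image; here is a minimal sketch of its shape (field names and types beyond TimeSheet/TimeSheetDays are illustrative):

<?xml version="1.0" encoding="UTF-8"?>
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
            targetNamespace="http://example.com/timesheet"
            xmlns:ts="http://example.com/timesheet"
            elementFormDefault="qualified">
  <!-- top-level payload element -->
  <xsd:element name="TimeSheet">
    <xsd:complexType>
      <xsd:sequence>
        <!-- a list of days, each carrying its details -->
        <xsd:element name="TimeSheetDays" type="ts:TimeSheetDayType"
                     maxOccurs="unbounded"/>
      </xsd:sequence>
    </xsd:complexType>
  </xsd:element>
  <xsd:complexType name="TimeSheetDayType">
    <xsd:sequence>
      <xsd:element name="day" type="xsd:date"/>
      <xsd:element name="hours" type="xsd:decimal"/>
    </xsd:sequence>
  </xsd:complexType>
</xsd:schema>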

I will use this in the BPEL Human Task by choosing “Other Payload” in my “Data” menu item

Select the schema you want …

Assign a name to refer to this payload type … you should see Task -> Payload -> <name-you-gave> as the structure.

End result … your task has your user-defined type for a payload. Now you can add data to it from a BPEL flow or an external call (convert your Human Task to an invokable “Composite with Service Bindings” by selecting that option when creating the task).

To use the data you do the following:

Optus’ Samsung Galaxy S – Kies Download mgr now has FROYO!!!

A couple of days back, when I got the new phone from Optus – a shiny Samsung Galaxy S – it came with a Kies version that said “No Firmware Updates Available”, and the firmware version was Android 2.1 ….

… the quest for Froyo ended last night when I checked for updates on Kies (Settings -> Updates) and it downloaded a bunch of new drivers etc., and on connecting the phone it said “Updates Available”. It was easy from there on: simply backed up the apps and reflashed the hardware with the new OS using the download manager.

The only problem was with the sync with Gmail/FB … had to uninstall FB and reinstall it. Could be that the 2.1 client was not compatible with 2.2. Other than that everything was smooth.

Oh! Last-minute add: *Sadface* I used to love the “Fall Leaves in a stream” live wallpaper and it looks like it’s no longer there. *Sigh* At least I have Skype now!!