Wednesday, 20 August 2014

JAXB Error : FWK005 parse may not be called while parsing

org.xml.sax.SAXException: FWK005 parse may not be called while parsing.
This error occurs when a SchemaFactory instance is shared by multiple threads, for example by declaring it at class level and using it in several methods:
private SchemaFactory factory = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
SchemaFactory is not thread safe. It should always be created and used in local scope, so whenever you need a SchemaFactory instance, create it locally instead of holding it as an instance field.
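A minimal sketch of the fix: the factory is created inside the method, so every caller (and every thread) gets its own instance. The class name and the inline schema are made up for illustration.

```java
import java.io.StringReader;
import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;

public class LocalSchemaFactoryDemo {

    // Create the SchemaFactory inside the method: it is not thread safe,
    // so each call gets its own instance instead of sharing a field.
    static boolean isValid(String xml, String xsd) {
        try {
            SchemaFactory factory =
                    SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
            Schema schema = factory.newSchema(new StreamSource(new StringReader(xsd)));
            Validator validator = schema.newValidator();
            validator.validate(new StreamSource(new StringReader(xml)));
            return true;
        } catch (Exception e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String xsd = "<xs:schema xmlns:xs=\"http://www.w3.org/2001/XMLSchema\">"
                   + "<xs:element name=\"greeting\" type=\"xs:string\"/></xs:schema>";
        System.out.println(isValid("<greeting>hello</greeting>", xsd)); // true
        System.out.println(isValid("<unknown/>", xsd));                 // false
    }
}
```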

Post Comments And Suggestions !!

Tuesday, 19 August 2014

How To Link Git Tag with Rails app

I use Git tags to manage the version numbers of my Rails apps.

Every time a new version is ready, I tag the current commit like this:
git tag -a v1.10 -m "FIXED PC-43/67/78"
I have created a Ruby file in config/initializers which defines a constant to hold this information (in this case "v1.10"):
APP_VERSION = `git describe --always`.strip unless defined? APP_VERSION # .strip removes the trailing newline
This constant simply contains the output of Git describe.
Now I can use it anywhere in my app where I would like to display the version number.

This post is written by Santosh.

Post Comments And Suggestions

Sunday, 17 August 2014

How to Print Full Stack Trace In Mule Flow

To print the full exception stack trace in Mule, you need to add a Groovy scripting element like this:
<?xml version="1.0" encoding="UTF-8"?>

<mule xmlns:scripting=""
 xmlns:http="" xmlns="" xmlns:doc="" xmlns:spring="" version="CE-3.3.1" xmlns:xsi="" xsi:schemaLocation=" ">
    <flow name="muleMavenSampleFlow1" doc:name="muleMavenSampleFlow1">
        <http:inbound-endpoint exchange-pattern="request-response" host="localhost" port="8081" doc:name="HTTP"/>
        <catch-exception-strategy doc:name="Catch Exception Strategy">
         <scripting:transformer doc:name="Print StackTrace">
    <scripting:script engine="Groovy">
                    <scripting:text><![CDATA[org.apache.log4j.Logger.getLogger("").error("Error Trace is : \n",exception)]]></scripting:text>
                </scripting:script>
            </scripting:transformer>
        </catch-exception-strategy>
    </flow>
</mule>
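The Groovy one-liner works because log4j prints the whole stack trace when the throwable is passed as the second argument to error(). Outside Mule, the same full trace can be produced in plain Java with only the standard library (a generic sketch, not Mule-specific):

```java
import java.io.PrintWriter;
import java.io.StringWriter;

public class StackTraceUtil {

    // Equivalent of what the Groovy one-liner achieves: turn a Throwable
    // into its full stack-trace text so it can be logged in one call.
    static String fullStackTrace(Throwable t) {
        StringWriter sw = new StringWriter();
        t.printStackTrace(new PrintWriter(sw));
        return sw.toString();
    }

    public static void main(String[] args) {
        // Build a sample exception and print its complete trace.
        String trace = fullStackTrace(new IllegalStateException("boom"));
        System.out.println(trace);
    }
}
```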

Post Comments And Suggestions !!

Wednesday, 16 July 2014

Mule Cron Job : Job Name Can Not be Empty Error

Writing a cron job is very easy in Mule, but you might get an error like this on application start-up:
Caused by: org.mule.api.lifecycle.LifecycleException: Failed to start inbound endpoint "endpoint.quartz.test.job"
 at org.mule.endpoint.DefaultInboundEndpoint.start(
 at org.mule.construct.AbstractFlowConstruct.startIfStartable(
 at org.mule.construct.AbstractPipeline.doStart(
 at org.mule.construct.AbstractFlowConstruct$2.onTransition(
 at org.mule.construct.AbstractFlowConstruct$2.onTransition(
 at org.mule.lifecycle.AbstractLifecycleManager.invokePhase(
 at org.mule.construct.FlowConstructLifecycleManager.fireStartPhase(
 at org.mule.construct.AbstractFlowConstruct.start(
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at sun.reflect.NativeMethodAccessorImpl.invoke(
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(
 at java.lang.reflect.Method.invoke(
 at org.mule.lifecycle.phases.DefaultLifecyclePhase.applyLifecycle(
 at org.mule.lifecycle.RegistryLifecycleManager$RegistryLifecycleCallback.onTransition(
 at org.mule.lifecycle.RegistryLifecycleManager.invokePhase(
 at org.mule.lifecycle.RegistryLifecycleManager.fireLifecycle(
 at org.mule.registry.AbstractRegistryBroker.fireLifecycle(
 at org.mule.registry.MuleRegistryHelper.fireLifecycle(
 at org.mule.lifecycle.MuleContextLifecycleManager$MuleContextLifecycleCallback.onTransition(
 at org.mule.lifecycle.MuleContextLifecycleManager$MuleContextLifecycleCallback.onTransition(
 at org.mule.lifecycle.MuleContextLifecycleManager.invokePhase(
 at org.mule.lifecycle.MuleContextLifecycleManager.fireLifecycle(
 at org.mule.DefaultMuleContext.start(
 at org.mule.module.launcher.application.DefaultMuleApplication.start(
 ... 4 more
Caused by: org.mule.api.lifecycle.LifecycleException: Failed to start Quartz receiver
 at org.mule.lifecycle.AbstractLifecycleManager.invokePhase(
 at org.mule.transport.ConnectableLifecycleManager.fireStartPhase(
 at org.mule.transport.AbstractTransportMessageHandler.start(
 at org.mule.transport.AbstractConnector.registerListener(
 at org.mule.endpoint.DefaultInboundEndpoint.start(
 ... 27 more
Caused by: org.mule.api.endpoint.EndpointException: Failed to start Quartz receiver
 at org.mule.transport.quartz.QuartzMessageReceiver.doStart(
 at org.mule.transport.AbstractMessageReceiver.doStartHandler(
 at org.mule.transport.AbstractTransportMessageHandler$3.onTransition(
 at org.mule.lifecycle.AbstractLifecycleManager.invokePhase(
 ... 31 more
Caused by: java.lang.IllegalArgumentException: Job name cannot be empty.
 at org.quartz.JobDetail.setName(
 at org.mule.transport.quartz.QuartzMessageReceiver.doStart(
 ... 34 more

Here's the XML file:
<?xml version="1.0" encoding="UTF-8"?>

<mule xmlns:quartz="" xmlns="" xmlns:doc="" xmlns:spring="" version="CE-3.3.1" xmlns:xsi="" xsi:schemaLocation=" ">
    <flow name="crontestFlow1" doc:name="crontestFlow1">
        <quartz:inbound-endpoint jobName="test_job" cronExpression="0 0/100 * * * ?" responseTimeout="10000" doc:name="Price File Report Quartz Job">
            <quartz:event-generator-job/>
        </quartz:inbound-endpoint>
        <logger level="INFO" doc:name="Logger"/>
    </flow>
</mule>

Notice that the jobName is not empty. The issue is the underscore: the jobName property cannot contain an underscore. Remove the underscore and the job will run fine. Whitespace characters are not allowed in the job name either.
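A quick way to sanity-check a candidate job name before deploying is a small helper like the one below. This is a hypothetical utility reflecting the restriction described above; limiting names to letters and digits is an assumption on my part and may be stricter than necessary.

```java
public class JobNameCheck {

    // Hypothetical helper: rejects the underscores and whitespace that
    // trip up the Quartz jobName, by allowing only letters and digits.
    static boolean isSafeJobName(String name) {
        return name != null && !name.isEmpty() && name.matches("[A-Za-z0-9]+");
    }

    public static void main(String[] args) {
        System.out.println(isSafeJobName("test_job")); // false: contains an underscore
        System.out.println(isSafeJobName("testJob"));  // true
    }
}
```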

Here's the bug.

Post Comments And Suggestions !!

Wednesday, 25 June 2014

Mule File Inbound And Threads

Requirement: the file inbound endpoint should pick up only a limited number of files, and only once these files are processed should it pick up the remaining files in the directory.

For this situation we could set the flow's processing strategy to synchronous, but that strategy picks up only one file at a time, which is not good.

Setting a receiver threading profile on the file inbound does not work either, because the file poller keeps acquiring files after each polling period while some files are still being processed. The solution is to use Mule's fork-and-join pattern, explained below.

First we create a custom file component which returns the required number of files. The number of files is configurable.
package com.javaroots;

import java.io.File;
import java.util.ArrayList;
import java.util.List;

import org.mule.api.MuleEventContext;
import org.mule.api.lifecycle.Callable;

public class FilePollerComponent implements Callable {

    private String pollDir;
    private int numberOfFiles;

    public String getPollDir() {
        return pollDir;
    }

    public void setPollDir(String pollDir) {
        this.pollDir = pollDir;
    }

    public int getNumberOfFiles() {
        return numberOfFiles;
    }

    public void setNumberOfFiles(int numberOfFiles) {
        if (numberOfFiles < 1) {
            throw new RuntimeException("Number of files can not be less than 1");
        }
        this.numberOfFiles = numberOfFiles;
    }

    public Object onCall(MuleEventContext eventContext) throws Exception {
        File f = new File(pollDir);
        File[] files = f.listFiles();
        if (files == null) {
            throw new Exception("Invalid Directory");
        }
        // Return at most numberOfFiles files from the poll directory.
        List<File> filesToReturn = new ArrayList<File>(numberOfFiles);
        int i = 0;
        for (File file : files) {
            if (i == numberOfFiles) {
                break;
            }
            filesToReturn.add(file);
            i++;
        }
        return filesToReturn;
    }
}

And the flow will look like this:
<?xml version="1.0" encoding="UTF-8"?>

<mule xmlns:vm=""
 xmlns="" xmlns:doc="" xmlns:spring="" version="CE-3.3.1" xmlns:xsi="" xsi:schemaLocation=" ">

    <spring:beans>
        <spring:bean id="filePoller" class="com.javaroots.FilePollerComponent">
            <spring:property name="pollDir" value="E:/fileTest"/>
            <spring:property name="numberOfFiles" value="3"/>
        </spring:bean>
    </spring:beans>

    <flow name="fileInboundTestFlow" doc:name="fileInboundTestFlow" processingStrategy="synchronous">
        <poll frequency="200000">
            <component doc:name="File Poller">
                <spring-object bean="filePoller"/>
            </component>
        </poll>
        <logger message="Size of payload is : #[message.payload.size()]" level="INFO"/>
        <choice doc:name="Choice">
            <when expression="#[message.payload.size() &gt; 0]">
                <request-reply>
                    <vm:outbound-endpoint path="out">
                        <collection-splitter/>
                    </vm:outbound-endpoint>
                    <vm:inbound-endpoint path="response">
                        <add-message-property key="MULE_CORRELATION_GROUP_SIZE" value="3"/>
                    </vm:inbound-endpoint>
                </request-reply>
                <logger level="INFO"/>
            </when>
        </choice>
    </flow>

    <flow name="processor">
        <vm:inbound-endpoint path="out"/>
        <component class="com.javaroots.SleepComponent" doc:name="sleep"/>
        <vm:outbound-endpoint path="response"/>
    </flow>
</mule>

In the flow, we first place our custom file poller component, which returns a list of at most three files. Then a request-reply scope processes this list and waits until all three files are processed. The actual processing of each individual file happens in another flow, where I have put a sleep component. Request-reply sends each payload to the VM outbound endpoint, and the other flow listens on the same VM endpoint.

Once that flow completes its processing, it returns the response to the request-reply scope. The collection aggregator blocks until the responses for all three files come back. In this way we can control the number of files being processed.

Make sure that MULE_CORRELATION_GROUP_SIZE equals the numberOfFiles we want processed concurrently.
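The blocking behaviour the flow relies on can be pictured in plain Java. This is only an analogy using an ExecutorService, not actual Mule code; the "processed:" prefix stands in for whatever per-file work the sleep component does.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ForkJoinAnalogy {

    // Process at most `batchSize` items concurrently and block until the
    // whole batch is done -- the same behaviour the request-reply scope
    // plus collection aggregator gives the Mule flow.
    static List<String> processBatch(List<String> files, int batchSize) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(batchSize);
        try {
            List<Future<String>> futures = new ArrayList<>();
            for (String f : files.subList(0, Math.min(batchSize, files.size()))) {
                futures.add(pool.submit(() -> "processed:" + f)); // fork
            }
            List<String> results = new ArrayList<>();
            for (Future<String> fut : futures) {
                results.add(fut.get()); // join: wait for every file in the batch
            }
            return results;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        // Four files in the directory, but only a batch of three is taken.
        System.out.println(processBatch(Arrays.asList("a.txt", "b.txt", "c.txt", "d.txt"), 3));
    }
}
```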

This solution was provided by David Dossot on my Stack Overflow question.

You can see the full source code here

Post Comments and Suggestions !!!

Thursday, 19 June 2014

Validate Xml Against XSD from ClassPath

I have described how to validate XML against a schema in another post. Now, if your schema files are on the classpath and one schema depends on another, you have to do the following to validate.

Create a custom ResourceResolver like this. The prefix is for schema files that are not in the root of the classpath: if your schema files live in a folder named schemas on the classpath, provide the prefix as /schemas.
package com.acs.plum.web.service.util.xml.validator;

import java.io.InputStream;

import org.w3c.dom.ls.LSInput;
import org.w3c.dom.ls.LSResourceResolver;

public class ResourceResolver implements LSResourceResolver {

    private String prefix;

    public LSInput resolveResource(String type, String namespaceURI,
            String publicId, String systemId, String baseURI) {
        // Prepend the configured prefix so the schema is looked up on the classpath.
        systemId = prefix + "/" + systemId;
        InputStream resourceAsStream = this.getClass().getResourceAsStream(systemId);
        return new CustomLSInput(publicId, systemId, resourceAsStream);
    }

    /**
     * @return the prefix
     */
    public String getPrefix() {
        return prefix;
    }

    /**
     * @param prefix the prefix to set
     */
    public void setPrefix(String prefix) {
        this.prefix = prefix;
    }
}


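To make the prefix handling concrete: the mapping the resolver performs on the systemId is plain string concatenation. A standalone sketch (the file name common-types.xsd is hypothetical):

```java
public class PrefixMapping {

    // Mirrors what ResourceResolver.resolveResource does with the systemId:
    // prepend the configured prefix before looking the schema up on the classpath.
    static String toClasspathPath(String prefix, String systemId) {
        return prefix + "/" + systemId;
    }

    public static void main(String[] args) {
        // A schema imported as "common-types.xsd" resolves under the /schemas folder.
        System.out.println(toClasspathPath("/schemas", "common-types.xsd"));
    }
}
```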
Create the CustomLSInput class:

import java.io.BufferedInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.Reader;

import org.w3c.dom.ls.LSInput;

public class CustomLSInput implements LSInput {

    private String publicId;

    private String systemId;

    private BufferedInputStream inputStream;

    public CustomLSInput(String publicId, String sysId, InputStream input) {
        this.publicId = publicId;
        this.systemId = sysId;
        this.inputStream = new BufferedInputStream(input);
    }

    public String getPublicId() {
        return publicId;
    }

    public void setPublicId(String publicId) {
        this.publicId = publicId;
    }

    public String getBaseURI() {
        return null;
    }

    public InputStream getByteStream() {
        return null;
    }

    public boolean getCertifiedText() {
        return false;
    }

    public Reader getCharacterStream() {
        return null;
    }

    public String getEncoding() {
        return null;
    }

    public String getStringData() {
        synchronized (inputStream) {
            try {
                byte[] input = new byte[inputStream.available()];
                inputStream.read(input); // actually fill the buffer before building the string
                return new String(input);
            } catch (IOException e) {
                return null;
            }
        }
    }

    public void setBaseURI(String baseURI) {
    }

    public void setByteStream(InputStream byteStream) {
    }

    public void setCertifiedText(boolean certifiedText) {
    }

    public void setCharacterStream(Reader characterStream) {
    }

    public void setEncoding(String encoding) {
    }

    public void setStringData(String stringData) {
    }

    public String getSystemId() {
        return systemId;
    }

    public void setSystemId(String systemId) {
        this.systemId = systemId;
    }

    public BufferedInputStream getInputStream() {
        return inputStream;
    }

    public void setInputStream(BufferedInputStream inputStream) {
        this.inputStream = inputStream;
    }
}
Now, in your main class, set the resource resolver with the proper prefix:

import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;

import org.xml.sax.SAXException;

public class XmlValidator {

    public static void main(String[] args) {
        try {
            SchemaFactory factory = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);

            ResourceResolver resolver = new ResourceResolver();
            resolver.setPrefix("/javaroots"); // set prefix if your schema is not in the root of the classpath

            factory.setResourceResolver(resolver); // use the resolver that carries the prefix
            Schema schema = factory.newSchema(new StreamSource(XmlValidator.class.getResourceAsStream("/MarketLive.xsd")));
            Validator validator = schema.newValidator();

            validator.validate(new StreamSource("D:\\sportschalet\\SchemaNSamples-5.9.8\\test\\sample_300189_3038341.xml"));
            System.out.println("Successfully validated");
        } catch (SAXException e) {
            e.printStackTrace();
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }
}
That's it !! Now you do not need to copy the schemas into a separate folder; you can keep them on your classpath.

Post comments and Suggestions !!!
StackOverFlow Ref

Thursday, 12 June 2014

Git : How to add commit in between old commits

I have a Git repository and need to rewrite my local history by inserting a new commit in between old commits.

More specifically, my situation is like this:

  AB—BC—CD  master

and I wanted to come up with something like this:

  AB—BC—SA—CD  master

where SA is my new commit, i.e. to be inserted between commits BC and CD.

Well, it isn't actually an extreme or difficult case. It's a very simple procedure:
$ git checkout master
$ git checkout -b temp BC 
$ git add .  # stage the changes for SA
$ git commit # your changes that will be SA
Now your repo will look like this:
  AB—BC—SA  temp
With this layout it's rather simple to transform the branches into a single sequence of commits:
 $ git rebase temp master
You may get a few conflicts that you need to resolve.

Now you are all done !!!


You will notice that the commit hashes after BC have changed, so your tags no longer point at the rewritten commits. To bring your tags back in line, follow these steps:
 $ git tag -l #to list all your tags.
For each tag type the following command,
 $ git show TAG_NAME
to see the details of the old commit.

Make note of the subject line, date, and hash of the old commit.
Page through git log looking for that subject line and date, and make note of the hash of the new commit when you find it.
 $ git tag --force TAG_NAME NEW_COMMIT_HASH #to update the tag.
Be careful not to mistake one commit for another with a similar subject line and date.

Thanks to Santosh Mohanty for writing this post.

Post Comments And Suggestions !!!