AUGUST 13, 2018

Recently MongoDB released version 4.0 of its open-source NoSQL document database. One of the main new features is support for multi-document transactions with true ACID semantics, which means that it’s now possible to make changes to multiple JSON documents in different collections in one transaction. For the Flowable Engines this is super-interesting: the engine logic often relies on transactional behaviour to make changes to multiple executions, tasks, jobs and other entities in the same transaction, so that process or case instances atomically change from one state to the next.

Obviously, this made us want to test the multi-document transactions of MongoDB in combination with Flowable. One of the nice low-level features of the Flowable Engines is that the lowest layer of the persistence logic – the so-called “data managers” – is pluggable, and the default implementations based on MyBatis can be swapped with another implementation. Let’s look at a simple example in the MongoDbProcessDefinitionDataManager that implements the ProcessDefinitionDataManager and therefore replaces the default MyBatisProcessDefinitionDataManager.

This method retrieves the process definition entity based on the unique id:

public ProcessDefinitionEntity findById(String id) {
    return getMongoDbSession().findOne(COLLECTION_PROCESS_DEFINITIONS, id);
}

As this code snippet shows, the logic in the MongoDB version of the data manager is quite straightforward. The “real” low-level MongoDB-specific persistence logic is implemented in the MongoDbSession class, which implements the org.flowable.common.engine.impl.interceptor.Session interface and is created by the MongoDbSessionFactory. This is very similar to the default MyBatis DbSqlSession and DbSqlSessionFactory classes that are part of the common Flowable Engine code and take care of the persistence logic for the relational database. This means that adding MongoDB support to the CMMN and DMN engines is quite easy to do, as this leverages the same common services.
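To make the session/session-factory split concrete, here is a schematic, self-contained sketch of the pattern. All names here (FakeMongoDbSession, FakeMongoDbSessionFactory, the trimmed-down Session interface) are made up for illustration only – they are not the actual Flowable or MongoDB driver types:

```java
import java.util.HashMap;
import java.util.Map;

// Schematic sketch of the session/session-factory pattern described above.
public class SessionSketch {

    // Stand-in for the role of org.flowable.common.engine.impl.interceptor.Session
    interface Session {
        void flush();
        void close();
    }

    interface SessionFactory {
        Session openSession();
    }

    // Stand-in for MongoDbSession: the real class wraps the MongoDB client;
    // this one just records the operations it would flush.
    static class FakeMongoDbSession implements Session {
        final Map<String, Object> pendingInserts = new HashMap<>();

        void insert(String collection, Object entity) {
            pendingInserts.put(collection, entity);
        }

        @Override public void flush() {
            // the real implementation would write the pending operations
            // inside the surrounding MongoDB transaction
            System.out.println("flushing " + pendingInserts.size() + " insert(s)");
        }

        @Override public void close() { }
    }

    // Stand-in for MongoDbSessionFactory
    static class FakeMongoDbSessionFactory implements SessionFactory {
        @Override public Session openSession() {
            return new FakeMongoDbSession();
        }
    }

    public static void main(String[] args) {
        SessionFactory factory = new FakeMongoDbSessionFactory();
        FakeMongoDbSession session = (FakeMongoDbSession) factory.openSession();
        session.insert("processDefinitions", "some entity");
        session.flush();  // prints: flushing 1 insert(s)
        session.close();
    }
}
```

The point of the indirection is that the engine code only ever talks to the Session abstraction, so a relational or a MongoDB session can be swapped in via the factory.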

When we get an entity from the MongoDB database, we get back a JSON document of type org.bson.Document with the MongoDB Java driver. Because all the layers above the data management layer in the Flowable Engines work with the Entity classes, we need to map from the entity class to an org.bson.Document and vice versa. MongoDB offers some convenience classes to do this for you, but we decided to make this really explicit and implement an EntityMapper class to do this for us. This is a snippet from the ProcessDefinitionEntityMapper class that implements the mapping logic for the ProcessDefinitionEntity:

public class ProcessDefinitionEntityMapper implements EntityMapper {

    public ProcessDefinitionEntityImpl fromDocument(Document document) {
        ProcessDefinitionEntityImpl processDefinitionEntity = new ProcessDefinitionEntityImpl();
        processDefinitionEntity.setId(document.getString("_id"));
        processDefinitionEntity.setName(document.getString("name"));
        return processDefinitionEntity;
    }

    public Document toDocument(ProcessDefinitionEntityImpl processDefinitionEntity) {
        Document processDefinitionDocument = new Document();
        processDefinitionDocument.append("_id", processDefinitionEntity.getId());
        processDefinitionDocument.append("name", processDefinitionEntity.getName());
        return processDefinitionDocument;
    }
}

For every Entity class in the Flowable Engine, an EntityMapper implementation is needed to easily convert between the Java and the MongoDB data structures. So, coming back to our findById method example in the ProcessDefinitionDataManager: the MongoDbSession findOne implementation retrieves the org.bson.Document from the MongoDB database using the unique id, and then the ProcessDefinitionEntityMapper is used to convert the JSON document into a ProcessDefinitionEntityImpl instance.
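As a self-contained illustration of that round trip, here is a sketch of the mapping logic using a plain java.util.Map as a stand-in for org.bson.Document (the class and field names below are simplified assumptions, not the real Flowable types):

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the EntityMapper idea: entity <-> "document" conversion.
public class ProcessDefinitionMapperSketch {

    // Simplified stand-in for ProcessDefinitionEntityImpl
    public static class ProcessDefinition {
        public String id;
        public String name;
    }

    // Entity -> document: mirrors the toDocument snippet in the post
    public static Map<String, Object> toDocument(ProcessDefinition entity) {
        Map<String, Object> doc = new HashMap<>();
        doc.put("_id", entity.id);
        doc.put("name", entity.name);
        return doc;
    }

    // Document -> entity: the inverse mapping
    public static ProcessDefinition fromDocument(Map<String, Object> doc) {
        ProcessDefinition entity = new ProcessDefinition();
        entity.id = (String) doc.get("_id");
        entity.name = (String) doc.get("name");
        return entity;
    }

    public static void main(String[] args) {
        ProcessDefinition original = new ProcessDefinition();
        original.id = "oneTask:1:4";
        original.name = "The One Task Process";

        // A round trip through the "document" preserves every mapped field
        ProcessDefinition copy = fromDocument(toDocument(original));
        System.out.println(copy.id + " / " + copy.name);
    }
}
```

The real mapper does exactly this kind of field-by-field copying, just against org.bson.Document instead of a Map.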

With the DataManager classes implemented, we need to expose these persistence classes to the process engine configuration. To make the usage convenient, we created a MongoDbProcessEngineConfiguration class extending the default ProcessEngineConfigurationImpl configuration class. This is a snippet of the implementation of the initDataManagers method that is overridden:

public void initDataManagers() {
    MongoDbDeploymentDataManager mongoDeploymentDataManager = new MongoDbDeploymentDataManager();
    mongoDbSessionFactory.registerDataManager(MongoDbDeploymentDataManager.COLLECTION_DEPLOYMENT, mongoDeploymentDataManager);
    this.deploymentDataManager = mongoDeploymentDataManager;
    MongoDbProcessDefinitionDataManager mongoDbProcessDefinitionDataManager = new MongoDbProcessDefinitionDataManager();
    mongoDbSessionFactory.registerDataManager(MongoDbProcessDefinitionDataManager.COLLECTION_PROCESS_DEFINITIONS, mongoDbProcessDefinitionDataManager);
    this.processDefinitionDataManager = mongoDbProcessDefinitionDataManager;
}

Actually (and as you can see in the class), there’s not a lot of magic needed. The above code snippet replaces the MyBatis data managers with the MongoDB equivalent data managers. Of course, a bit more logic is needed in the MongoDbProcessEngineConfiguration implementation, such as creating the collections in the MongoDB database at first startup and creating the MongoDbSessionFactory instance. Have a look at the full implementation of the MongoDbProcessEngineConfiguration class and you’ll see there’s not that much logic implemented there. But more importantly, using this configuration is no different from using the default (relational) configuration.

Once the configuration is done, everything else stays exactly the same as before: services, concepts, extension points … all of it!

With the MongoDB engine configuration in place, we can now implement a unit test and boot up the process engine. This can be done with the following code snippet:

ProcessEngine processEngine = new MongoDbProcessEngineConfiguration()
    .setServerAddresses(Arrays.asList(new ServerAddress("localhost", 27017), new ServerAddress("localhost", 27018),
            new ServerAddress("localhost", 27019)))
    .buildProcessEngine();

Before we can run a unit test, we first need to download and boot the MongoDB database. Just download the latest version from the MongoDB website and start 3 instances to create a replica set (according to the documentation, transactions currently only work on replica sets):

 ./mongod --port 27017 --dbpath ../data1/ --replSet rs0
 ./mongod --port 27018 --dbpath ../data2/ --replSet rs0
 ./mongod --port 27019 --dbpath ../data3/ --replSet rs0

Then the replica set can be created (just a one-time action) with:
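Assuming the three instances started above, a one-time initiation can be run through the mongo shell; the exact member list below is an assumption based on the ports used:

```shell
# one-time replica set initiation, run against one of the three instances
./mongo --port 27017 --eval 'rs.initiate({_id: "rs0", members: [
  {_id: 0, host: "localhost:27017"},
  {_id: 1, host: "localhost:27018"},
  {_id: 2, host: "localhost:27019"}]})'
```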


With MongoDB running we can execute a first unit test that deploys a BPMN XML file with a start event, one user task and an end event.

public void testOneTaskProcess() {
    ProcessInstance processInstance = runtimeService.startProcessInstanceByKey("oneTask");
    Task task = taskService.createTaskQuery().singleResult();
    taskService.complete(task.getId());
    assertEquals(0, runtimeService.createProcessInstanceQuery().count());
}

As you can see, this is just a normal unit test like the many we have in the BPMN engine test suite. This shows the beauty and power of the persistence pluggability in the Flowable Engines: nothing in this example would make you think it is actually running on MongoDB if you hadn’t seen the setup method.

You can find the full set of unit tests and the MongoDB integration code in the mongodb branch. It also includes a test that verifies that a transaction actually gets rolled back when an exception happens.

The Flowable MongoDB module can be found there as well.

Currently we have support for user tasks, async and timer jobs, nested sub-processes, and signal and message events. So this is not a simple prototype; we’re really on the path to full BPMN support on MongoDB. Of course, there’s still plenty of work left to do, like supporting all query options and extending the implementation to the same level of BPMN support as the relational database implementation.

We’re really excited about this and welcome any feedback on the MongoDB implementation. Furthermore, we’ve quite often heard from people in the past wanting to run Flowable on MongoDB, but the lack of transactions was always a hurdle … not anymore!

We also plan to run a performance benchmark, as we have done for the MyBatis implementation a number of times in the past. This should give valuable insight into the scalability and performance that MongoDB can bring to the Flowable Engines and drive further implementations and improvements.


Tijs Rademakers

VP Engineering

BPM enthusiast and Flowable project lead.
