Flowable core developers Filip Hrisafov and Joram Barrez continue the Flowable + serverless journey by looking at other technologies that make a serverless “process as a function” possible.
In part 1, they discussed what serverless is and its challenges with regard to the Flowable engines, and demonstrated implementations using Flowable with Spring Cloud (including running it on AWS Lambda), Micronaut, and GraalVM.
In this follow-up, they take a closer look at implementing such a function using Flowable combined with Spring Fu and building a native image with GraalVM. The end result is an incredible 13-millisecond boot time for a full-fledged Flowable-powered function!
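To make the "process as a function" idea concrete, here is a minimal sketch using only the plain Flowable Java API: a java.util.function.Function that boots an in-memory process engine at cold start and starts one process instance per invocation. The process key "myProcess" and the resource name "my-process.bpmn20.xml" are illustrative placeholders, and the Spring Fu wiring and GraalVM native-image build described in the article are deliberately left out.

```java
import java.util.Map;
import java.util.function.Function;

import org.flowable.engine.ProcessEngine;
import org.flowable.engine.ProcessEngineConfiguration;
import org.flowable.engine.runtime.ProcessInstance;

public class ProcessAsFunction implements Function<Map<String, Object>, String> {

    private final ProcessEngine processEngine;

    public ProcessAsFunction() {
        // In-memory H2 database: the engine schema is created on boot and thrown
        // away afterwards, which is what makes a short-lived function feasible.
        this.processEngine = ProcessEngineConfiguration
                .createStandaloneInMemProcessEngineConfiguration()
                .buildProcessEngine();

        // Deploy the BPMN model once per cold start (hypothetical resource name).
        processEngine.getRepositoryService()
                .createDeployment()
                .addClasspathResource("my-process.bpmn20.xml")
                .deploy();
    }

    @Override
    public String apply(Map<String, Object> variables) {
        // Each invocation starts one process instance and returns its id.
        ProcessInstance instance = processEngine.getRuntimeService()
                .startProcessInstanceByKey("myProcess", variables);
        return instance.getId();
    }
}
```

In the article itself, a function along these lines is wired up through Spring Fu and compiled ahead of time into a GraalVM native image, which is where the 13-millisecond boot time comes from.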
As AI gains prominence as a pivotal technology and enterprises increasingly seek to leverage its capabilities, we are actively exploring diverse avenues for integrating AI into process automation. In the past few months, this has culminated in a clear understanding of the strengths and weaknesses of Generative AI (GenAI) technology: where it makes sense to integrate with it and, perhaps more importantly, where it doesn't.
The key to managing complexity is combining multiple tools, which leads to better, faster, and more maintainable solutions: for example, combining BPMN with CMMN.
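As a small, hedged illustration of that combination, the sketch below starts a CMMN case through Flowable's CmmnRuntimeService inside a Spring application. The case definition key "orderHandlingCase" and the service name are hypothetical; the assumption is that the case model contains a process task that hands the structured part of the work over to a BPMN process. This is a sketch under those assumptions, not a prescribed implementation.

```java
import java.util.Map;

import org.flowable.cmmn.api.CmmnRuntimeService;
import org.flowable.cmmn.api.runtime.CaseInstance;
import org.springframework.stereotype.Service;

@Service
public class OrderHandlingService {

    private final CmmnRuntimeService cmmnRuntimeService;

    public OrderHandlingService(CmmnRuntimeService cmmnRuntimeService) {
        this.cmmnRuntimeService = cmmnRuntimeService;
    }

    public CaseInstance startOrderCase(String orderId) {
        // Starts a CMMN case; a process task inside the (hypothetical) case model
        // "orderHandlingCase" delegates the structured fulfilment steps to a BPMN process.
        return cmmnRuntimeService.createCaseInstanceBuilder()
                .caseDefinitionKey("orderHandlingCase")
                .variable("orderId", orderId)
                .start();
    }
}
```

The design idea is that CMMN handles the flexible, event-driven part of the work, while the repeatable, structured steps live in a BPMN process that the case invokes.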