I have a process that contains a multi-instance sub process. I ran a sample execution that generated 2000 sub processes. After the last activity inside the sub process finished, it took over an hour for the sub process itself to complete.
Sample Logs:
2017-11-20 08:21:05.217  INFO 14498 --- [nio-8080-exec-3] log output from last sub process activity
2017-11-20 09:31:49.512  INFO 14498 --- [nio-8080-exec-3] log output from first activity after sub process
Is there a way to make this more efficient, or does this lag between the last activity in the sub process and the end of the sub process keep growing with the number of items in the multi-instance?
It is really difficult to say without knowing exactly what you are doing in the sub process. I would not expect the Flowable engine itself to introduce the lag, so I am interested in the sub process implementation.
I have since refactored the process to use Camel for the sub processing, but in the original BPMN execution I selected 2000 records from MongoDB and created a sub process to enrich the data.
With 100 records it took about 10 minutes between the last activity and the first activity after the sub process.
With 2000 records it took about an hour.
With 10000 records I gave up and killed the process.
My thought is that the database is in the middle of one transaction covering all of the sub processes, and it takes a while to commit everything coming out of the sub process.
It is hard to guess, but I would expect a problem with the transaction commit. If you split your execution into several smaller transactions, the problem could be solved.
I was going through some of the Flowable properties but, unfortunately, I am not able to find any property that would commit the transaction after every N records… Could you please point me to a property or configuration that does this, if one exists?
This is not a configuration property. You have to change your model to allow splitting the work into several transactions. You can use, for example, multi-instance behavior to achieve it.
Another possibility is to use the batch service (it does not have support in activities yet).
The trick is that you can use a Collection&lt;Collection&lt;Item&gt;&gt;.
The outer collection limits the transaction size, and each inner collection contains the items processed in one transaction. You have to set the async flag to true so that a new transaction is started for each batch.
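A minimal sketch of what that could look like in the process XML, assuming made-up names throughout (the ids, the delegate class and the variables batches, batch and item are all illustrative): an outer multi-instance sub process iterates over the batches and is marked flowable:async="true", while an inner multi-instance service task processes the items of one batch inside that batch's transaction.

```xml
<!-- Illustrative fragment only; ids, variable names and the delegate class are hypothetical. -->
<subProcess id="batchSubProcess" name="Process one batch" flowable:async="true">
  <!-- Outer multi-instance: one iteration per batch; the async continuation
       means each batch is executed as its own job in its own transaction. -->
  <multiInstanceLoopCharacteristics isSequential="true"
      flowable:collection="batches" flowable:elementVariable="batch"/>

  <startEvent id="batchStart"/>
  <sequenceFlow id="toEnrich" sourceRef="batchStart" targetRef="enrichItem"/>

  <!-- Inner multi-instance: processes the items of one batch inside the
       transaction opened for that batch. -->
  <serviceTask id="enrichItem" name="Enrich item"
      flowable:class="com.example.EnrichItemDelegate">
    <multiInstanceLoopCharacteristics isSequential="true"
        flowable:collection="batch" flowable:elementVariable="item"/>
  </serviceTask>

  <sequenceFlow id="toBatchEnd" sourceRef="enrichItem" targetRef="batchEnd"/>
  <endEvent id="batchEnd"/>
</subProcess>
```

The batches variable would have to be prepared beforehand, e.g. by splitting the 2000 records into lists of a few hundred items each, so the commit at the end of each batch stays small.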
Another possibility is to use a loop back in the model with the async flag set.
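For the loop-back variant, a rough fragment could look like the following; again the ids, the delegate class and the recordsRemaining variable are hypothetical. An async service task processes the next chunk, and an exclusive gateway routes back to it while records remain, so every iteration starts on a fresh job and transaction.

```xml
<!-- Illustrative fragment only; ids, the delegate class and the
     recordsRemaining variable are hypothetical. -->
<serviceTask id="processChunk" name="Process next chunk"
    flowable:async="true"
    flowable:class="com.example.ProcessChunkDelegate"/>
<exclusiveGateway id="moreRecords" name="More records?"/>

<sequenceFlow id="toGateway" sourceRef="processChunk" targetRef="moreRecords"/>

<!-- Loop back while there are still records to process. -->
<sequenceFlow id="loopBack" sourceRef="moreRecords" targetRef="processChunk">
  <conditionExpression xsi:type="tFormalExpression">${recordsRemaining}</conditionExpression>
</sequenceFlow>

<!-- Continue with the rest of the process once everything is committed. -->
<sequenceFlow id="done" sourceRef="moreRecords" targetRef="afterSubProcess">
  <conditionExpression xsi:type="tFormalExpression">${!recordsRemaining}</conditionExpression>
</sequenceFlow>
```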