Updating Multiple Processes at Once

So I have found myself with a rather unique problem. I’m pretty familiar with the standard paradigm of “User Clicks Submit, update the Workflow Task” and all the underlying APIs associated with that behavior (task.CompleteTask() being chief among them).

The problem I find myself looking at now is a UI where the user selects multiple objects, each backed by its own workflow. When the user selects multiple objects and submits, only the selected objects (upwards of 500 in a single bundle) should update their corresponding workflows (think bulk approval of a bunch of tasks). What’s the best way to target a batch update like this? I can only think of two solutions. The first is to loop through each of my 500 objects and individually update the workflow for each one. That’s going to be awful for performance, though (JDBC not being a fan of rapid-fire DB calls like that). The other solution I can think of is to use an intermediate event that waits for a signal from the task API. The problem I run into with this idea is that, as far as I understand, events are broadcast to all workflows. I need a precise update: while I want to bulk-complete 500 workflow tasks, I could have another 200 tasks that I want to stay right where they are.

Since I can’t see the solution, I thought I’d throw it out here and see if anyone could offer insight. I’m running Flowable 6.4 currently.

If you’re able to upgrade to 6.5.0, it sounds like the new event registry is a good fit for your problem. It allows you to have the event broadcasts you mention, but with event correlation only the relevant workflows will actually receive the event (this is done with a performant lookup).
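To make the correlation idea concrete, here is a self-contained sketch in plain Java. This is purely conceptual and is not the Flowable event registry API: waiting workflow instances subscribe under a correlation key, and a broadcast only reaches the subscribers whose key matches, leaving everything else untouched.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Conceptual sketch of event correlation (NOT the Flowable API).
// Each waiting workflow subscribes under a correlation key; an incoming
// event only triggers the workflows whose key matches.
public class CorrelationSketch {

    static class Registry {
        private final Map<String, List<String>> waitingByKey = new HashMap<>();

        void subscribe(String correlationKey, String processInstanceId) {
            waitingByKey.computeIfAbsent(correlationKey, k -> new ArrayList<>())
                    .add(processInstanceId);
        }

        // Returns the instances the event reached; all other subscribers stay put.
        List<String> broadcast(String correlationKey) {
            List<String> triggered = waitingByKey.remove(correlationKey);
            return triggered == null ? new ArrayList<>() : triggered;
        }
    }

    public static void main(String[] args) {
        Registry registry = new Registry();
        registry.subscribe("bundle-42", "wf-1");
        registry.subscribe("bundle-42", "wf-2");
        registry.subscribe("bundle-99", "wf-3"); // different bundle: must NOT fire

        System.out.println(registry.broadcast("bundle-42")); // [wf-1, wf-2]
    }
}
```

The point of the sketch is the selectivity: a broadcast keyed to one bundle of 500 tasks never touches the other 200 tasks waiting under different keys.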

Is there an example in the documentation for using this feature? It’s been a while since I’ve had to do event broadcasts in a workflow (last time I touched it was back in like 2015). I’d need to understand both how to do this from the APIs and whether it’s possible to have the events sent from one workflow to another. The end goal here is a single bulk-approval workflow that users interact with. When they approve the bulk action, it sends that correlated event out to do the bulk update to all the listeners. If I can do it from within the workflow, I may not need asynchronous server-side calls to run the status-update logic; I can just fire and forget.

The Flowable 6.5 upgrade will take a bit of work, because we currently have Flowable 6.4 in the production environment and going to 6.5 isn’t as simple for us as flipping a switch (though serious kudos to you at Flowable for making the upgrade itself as painless as possible). It will require us to run the upgrade through an SDLC process to ensure we don’t break anything. But I will mention it and see if the 6.5 upgrade and the new features in there get us across the finish line on this.

Hi Jeff,

The event registry is described here: https://blog.flowable.org/2020/02/08/introducing-the-flowable-event-registry/

It needs an event framework, such as Kafka, RabbitMQ or ActiveMQ. In the next open source release (it’s already in the main codebase), there’s also “internal” events that don’t need an external framework.

Cheers
Paul.

Any idea when the next open source release is slated? There’s going to be significant overhead to implement a third-party messaging queue, beyond the time it’ll take me to figure out how to do it. If the release is coming soon, I may see if it’s worth re-organizing my development timeline to wait for the internal events support to show up.

Should be very soon; you won’t need to wait for Christmas.

Events are supported in 6.5, which has been available since February.

There are still some rough edges around it (in particular from the Designer: for instance, models with event handling can be imported, but the import breaks somewhere internally, and although everything looks fine the model doesn’t work in the end, which is quite an annoying gotcha), but the core part works as intended.

Is that internal event framework going to bake in something like RabbitMQ, or is it just going to be an extension of the existing intermediate events capability that is more friendly towards a targeted broadcast instead of a generic broadcast?

Hey @jeff.gehly,

The internal event framework uses the same concepts that we added in 6.5. The only difference is that instead of sending something over the wire, it happens in memory.

What you can do within 6.4 (and then later migrate to the internal event registry in 6.5) is to schedule your own custom job for every action.

In one transaction you can create the 500 jobs and then the rest will be done by Flowable.

For example, the custom job handler can look like this:

import org.flowable.common.engine.impl.interceptor.CommandContext;
import org.flowable.engine.impl.util.CommandContextUtil;
import org.flowable.job.service.JobHandler;
import org.flowable.job.service.impl.persistence.entity.JobEntity;
import org.flowable.variable.api.delegate.VariableScope;

public class CustomJobHandler implements JobHandler {

    public static final String TYPE = "customJobType";

    @Override
    public String getType() {
        return TYPE;
    }

    @Override
    public void execute(JobEntity job, String configuration, VariableScope variableScope, CommandContext commandContext) {
        // The job's configuration string carries the id of the task to complete.
        CommandContextUtil.getProcessEngineConfiguration(commandContext)
                .getTaskService().complete(configuration);
    }
}

You can create a job in the following way:

managementService.executeCommand(commandContext -> {
    JobServiceConfiguration jobServiceConfiguration =
            CommandContextUtil.getJobServiceConfiguration(commandContext);
    JobService jobService = jobServiceConfiguration.getJobService();
    JobEntity job = jobService.createJob();
    job.setJobHandlerType(CustomJobHandler.TYPE);
    job.setConfiguration(taskId); // the task id travels with the job
    jobService.createAsyncJob(job, false);
    jobService.scheduleAsyncJob(job);
    return null;
});

You’ll need to create all your bulk jobs by iterating over your objects.
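The shape of this pattern, stripped of the Flowable APIs, is sketched below. This is a self-contained simulation, not Flowable code: a plain in-memory queue stands in for the job table, and a thread pool stands in for Flowable’s async executor. The real implementation would enqueue jobs via `jobService.createAsyncJob` inside a single command, as in the snippet above.

```java
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

// Simulation of the bulk-job pattern (NOT Flowable code): enqueue all jobs
// in one pass, then let background workers drain them, the way Flowable's
// async executor picks up scheduled jobs after the transaction commits.
public class BulkJobSketch {

    public static int bulkComplete(List<String> taskIds) throws Exception {
        BlockingQueue<String> jobQueue = new LinkedBlockingQueue<>();
        AtomicInteger completed = new AtomicInteger();

        // Step 1: one pass creates all the jobs up front (in Flowable, this is
        // the single command that creates the 500 job rows in one transaction).
        jobQueue.addAll(taskIds);

        // Step 2: workers process the jobs asynchronously.
        ExecutorService workers = Executors.newFixedThreadPool(4);
        for (int i = 0; i < 4; i++) {
            workers.submit(() -> {
                String taskId;
                while ((taskId = jobQueue.poll()) != null) {
                    completed.incrementAndGet(); // stand-in for completing the task
                }
            });
        }
        workers.shutdown();
        workers.awaitTermination(10, TimeUnit.SECONDS);
        return completed.get();
    }

    public static void main(String[] args) throws Exception {
        List<String> taskIds = IntStream.range(0, 500)
                .mapToObj(i -> "task-" + i)
                .collect(Collectors.toList());
        System.out.println(bulkComplete(taskIds)); // 500
    }
}
```

The caller’s transaction only pays for creating the jobs; the 500 completions then happen in the background across worker threads instead of as one giant synchronous loop.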

You’ll also need to register your custom job handler with the Process engine. If you are using Spring Boot you can do it in the following way:

@Bean
public EngineConfigurationConfigurer<SpringProcessEngineConfiguration> customEngineConfiguration() {
    return engineConfiguration -> {
        engineConfiguration.addCustomJobHandler(new CustomJobHandler());
    };
}

Cheers,
Filip

This is massively helpful. Just to be sure I’m connecting the dots here: this is a means for me to loop over the list of 500 tasks and spin up async jobs that come along and update each task to the appropriate step once Flowable has a chance to finish processing?

From a BPMN implementation perspective, I would use a system task hooked to a REST endpoint that when called performs the job creation, and then returns back to the workflow. So I could go “Approve” once on my 500 records and then the system chugs along on them under the hood? I’m just making sure I get the concept here.

And then how would this port over to the internal events once that’s available to me? This looks like a way to achieve what I need in 6.4 without totally killing performance, but if internal events are going to be more elegant and porting over will be a pain (or I end up having to rework everything anyway), then it probably makes sense for me to hold off on this feature implementation until I can get access to internal events in the open source release (and proceed to badger you all over again :wink: ).

Yes, the code I shared is one way you can loop over your list and create async jobs. Once the transaction is completed, Flowable will run the jobs asynchronously in different threads.

I don’t know what your UI looks like. It doesn’t have to be done via BPMN and a service task. It can be your own dedicated endpoint that you call, which loops through the tasks and creates the jobs. If the bulk approval is already part of a BPMN process that you have, then yes, a system task hooked to a REST endpoint would work.

For the internal events, you will also need to model it like that. e.g. using an intermediate catch event or receive event task. For those events you’ll need to have some kind of a correlation id, such that when a matching event comes along it will trigger that flow.

Apart from changing your models you will also need to adapt the CustomJobHandler I shared so that it uses the EventRegistry#setSystemEventOutbound(EventInstance) instead of doing the completion of a user task.

Cheers,
Filip

I was getting into the meat of this and registering my custom job handler with the process engine when I found something rather odd. Is there a reason that StandaloneProcessEngineConfiguration does not extend the ProcessEngineConfigurationImpl abstract class? Because it doesn’t, I must use a different engine configuration (namely one based upon Spring) in order to register custom job handlers.

Hey @jeff.gehly,

The StandaloneProcessEngineConfiguration does extend from ProcessEngineConfigurationImpl; have a look here.

Cheers,
Filip

That’s what I saw as well, but either I’m losing my mind or there’s something funky with the class inheritance. This code will fail:

ProcessEngineConfiguration config = ProcessEngineConfiguration.createStandaloneProcessEngineConfiguration();

JobHandler packageJob = new CustomJobHandler();
List<JobHandler> customHandlers = new ArrayList<JobHandler>();
customHandlers.add(packageJob);

config.setCustomJobHandlers(customHandlers);

This code will run and not throw an error, but I can’t get the custom job handler to register properly (still trying to sort through how to get that working, it’s really deep into the framework and confusing to puzzle through).

ProcessEngineConfigurationImpl config = (ProcessEngineConfigurationImpl) ProcessEngineConfiguration.createStandaloneProcessEngineConfiguration();

JobHandler packageJob = new CustomJobHandler();
List<JobHandler> customHandlers = new ArrayList<JobHandler>();
customHandlers.add(packageJob);

config.setCustomJobHandlers(customHandlers);

So what I can’t wrap my head around is why I have to cast to the abstract implementation class at all, when the object the factory hands back should already be a subclass of it. And this isn’t just Eclipse throwing a syntax error at me; I get a runtime exception that the method cannot be found.
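For what it’s worth, the behavior described here is consistent with static typing rather than a broken class hierarchy: if the factory method is declared to return the base `ProcessEngineConfiguration` type and `setCustomJobHandlers` is only declared on the `ProcessEngineConfigurationImpl` subclass, the method simply isn’t visible through a base-typed reference until you cast, even though the actual object is the subclass. A minimal self-contained illustration of the same situation, using hypothetical class names:

```java
// Minimal illustration with hypothetical names: the factory returns the base
// type, so a method declared only on the subclass needs a cast to be visible.
public class CastSketch {

    static abstract class BaseConfig {
        static BaseConfig createStandalone() {
            return new StandaloneConfig(); // the actual object IS the subclass...
        }
    }

    static abstract class ConfigImpl extends BaseConfig {
        private String handlers;
        void setCustomHandlers(String handlers) { this.handlers = handlers; }
        String getCustomHandlers() { return handlers; }
    }

    static class StandaloneConfig extends ConfigImpl {
    }

    public static void main(String[] args) {
        BaseConfig config = BaseConfig.createStandalone();
        // config.setCustomHandlers("x");      // does not compile: not on BaseConfig
        ConfigImpl impl = (ConfigImpl) config; // ...so the cast exposes the method
        impl.setCustomHandlers("customJobType");
        System.out.println(impl.getCustomHandlers()); // customJobType
    }
}
```

The cast succeeds at runtime because the returned object really is a subclass instance; only the compile-time (or reflective) view through the base type hides the setter.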

So at the end of it all, I finally got this working. There seemed to be multiple parts to the problem, and the combination of fixing all the parts got me where I needed to be.

  1. I misspelled the TYPE value in my customJobType.java class file. So I registered the job handler with Flowable as cutomJobType, but scheduled the job under customJobType. Flowable therefore went looking for the wrong value in the job queue: I missed the s.

  2. I added logging statements so I could see when I was scheduling and when I was executing the job. This helped me figure out the first problem, because I could see in my command-line demo that I was scheduling the job but not executing it, because the TYPE value didn’t line up with what I expected it to be.

  3. I tried to be cute and do the engine setup in ColdFusion but schedule the job in the Java code I wrote, passing in the constructed engine as a parameter. This turned out to be a no-no. I don’t have the answer as to why, but adjusting my approach to let my Java code do all the work got me where I needed to be. It will remain one of the great mysteries of the programming world, because I have neither the time nor the inclination to delve into the depths of ColdFusion and figure it out. I prefer to imagine a TRON-esque battle in the JVM between ColdFusion and Flowable.

Thank you for posting this. It did bring a smile to my face :wink: