Deployment best practices

I am currently building a workflow microservice with Spring Boot that uses Flowable as its engine. Here is my current procedure for getting processes into the engine:
1. I build the process using a local copy of Flowable Designer running on my machine.
2. I then export the process from Designer and copy the resulting BPMN file into the resources/processes directory of my Eclipse-based project.
3. When I restart my service, Spring Boot recognizes that there is a new file and performs a deployment.
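For reference, the auto-deploy behaviour in step 3 can be tuned through configuration. This is just a sketch: the property names below are the ones exposed by flowable-spring-boot-starter in recent 6.x releases, but double-check them against the version you are actually running.

```properties
# Scan the classpath for process definitions on startup (this is the default).
flowable.check-process-definitions=true
# Location the starter scans for definitions to auto-deploy.
flowable.process-definition-location-prefix=classpath*:/processes/
flowable.process-definition-location-suffixes=**.bpmn20.xml,**.bpmn
```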

When I need to modify a process, I import it into my local copy of the Modeler, make the changes, and then go through the export-and-copy steps again. I do this to ensure that nobody else has modified the file since the last time I worked on it.

In my opinion, these workflows (BPMN files) are part of the source code. They should live alongside the rest of the source code and be version controlled the same way. The biggest reason this matters to me is that our code goes through several environments before it reaches production (dev, QA, UAT, etc.). Keeping the processes with the code is the best way to ensure that the correct versions make it into each release; I do not want to start copying database tables between environments as we promote a build. Our deployments are automated through Jenkins, and I need to find an elegant solution with little manual intervention.

I have considered having the modeler available in a central location for everyone to use. I would then write a job to fetch all of the processes, package that up and add it to the release artifact. This would solve some problems but would also cause a number of other issues.
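For what it's worth, the "fetch the processes, package them up, add them to the release artifact" job could be as small as a script like this. It is only a sketch under assumed conventions (a flat processes/ directory, a zip-style artifact, and a made-up helper name); adapt the paths to your build layout.

```python
# Sketch: bundle every exported BPMN file under a processes/ directory
# into a zip that Jenkins can attach to the release artifact.
import zipfile
from pathlib import Path


def package_processes(process_dir: str, artifact_path: str) -> list[str]:
    """Zip all .bpmn20.xml / .bpmn files and return the file names packaged."""
    packaged = []
    with zipfile.ZipFile(artifact_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for pattern in ("*.bpmn20.xml", "*.bpmn"):
            for f in sorted(Path(process_dir).glob(pattern)):
                # Store under processes/ so the layout matches the
                # Spring Boot auto-deploy location inside the artifact.
                zf.write(f, arcname=f"processes/{f.name}")
                packaged.append(f.name)
    return packaged
```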

I am sure I am not the first to encounter this and was wondering how others have solved this particular problem. I would be interested in any thoughts and ideas that could help with this situation.

Thanks,

Stephane

This is indeed a frequently heard problem. The answer depends on the kind of company, its habits, infrastructure, and the use cases you're building, so there is definitely no 'one size fits all'. Having the models as part of the source code is something we've seen a lot, and it is good practice. Typically, though, these processes are built by both developers and business analysts, so a central location is needed for that.
Your last suggestion of a central Modeler is not a bad one; however, you'd need a way to tag models as 'ready' (for adding technical details, for UAT, for production, etc.). All of that can be automated through the REST APIs (or your own REST APIs) as part of the build, and the models could even be copied into dedicated folders in the source code during those same builds.
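The "automate through the REST APIs" idea could look roughly like the sketch below: a build step that pulls models marked as ready out of a central Modeler and drops the BPMN XML into the source tree. Heavy hedging applies: the endpoint paths, the `MODELER_URL`, and the convention of marking a model "ready" via its description are all assumptions for illustration, not a documented Flowable API; check your Modeler's actual REST interface (and add authentication) before using anything like this.

```python
# Sketch of a Jenkins step that exports "ready" models from a central
# Flowable Modeler over REST into the project's resources/processes dir.
# Endpoint paths and the "ready" tagging convention are ASSUMPTIONS.
import json
import re
import urllib.request
from pathlib import Path

MODELER_URL = "http://modeler.example.com"      # hypothetical central Modeler
TARGET_DIR = Path("src/main/resources/processes")


def safe_filename(name: str) -> str:
    """Turn a model name into a safe .bpmn20.xml file name."""
    stem = re.sub(r"[^A-Za-z0-9._-]+", "-", name).strip("-")
    return f"{stem}.bpmn20.xml"


def fetch_json(url: str):
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)


def export_ready_models() -> None:
    TARGET_DIR.mkdir(parents=True, exist_ok=True)
    # Assumed endpoint returning {"data": [...]} -- verify against your Modeler.
    models = fetch_json(f"{MODELER_URL}/api/editor/models")
    for model in models.get("data", []):
        # "Ready" tagging is a team convention (here: a word in the
        # description), not a built-in Modeler feature.
        if "ready" not in (model.get("description") or "").lower():
            continue
        xml_url = f"{MODELER_URL}/api/editor/models/{model['id']}/bpmn20"
        xml = urllib.request.urlopen(xml_url).read()
        (TARGET_DIR / safe_filename(model["name"])).write_bytes(xml)


if __name__ == "__main__":
    export_ready_models()
```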

It would be good if there were an integration so the model repository could be Git or another code repository.


You are right that there are many ways to accomplish this, and different organizations will handle it differently depending on their practices. In asking the question, I was hoping that some people would share their approach to solving this problem. I would rather borrow than re-invent. 🙂

Thanks for the input!

Stephane

I use Spring Boot auto-deploy from the resources/processes location, same as you. For simple processes I still use the flowable-designer Eclipse plugin. Unfortunately, it's far from ideal: it is no longer developed, it lacks new features, and it can be incompatible with the Modeler. Nevertheless, it's much faster for a developer to play with and test flows directly within Eclipse. I hope flowable-designer gets some attention in future releases.

I completely agree with you that the processes should be version controlled in the same way code is.

Hello,
is there any news on this thread?

Which deployment strategy did you finally settle on?

JC